How Much Data Can You Load into Salesforce in One Hour?
If you’ve read the many blog posts, wikis, webinars, and other content from the Customer-Centric Engineering Technical Enablement team, then you already know how we help customers investigate and solve challenging problems, take what we learn, and teach others how to avoid similar problems. The idea is to make tough roads easier to travel for everyone who comes next.
Loading extreme amounts of data into the Salesforce1 Platform is a problem area where we often see customers struggle, especially when trying to take advantage of the Bulk API’s parallel loading feature with large volumes of data. So as part of a recent customer investigation, Sean Regan did an incredible job testing, benchmarking, and documenting all sorts of ways that you can set up bulk data loads for the Salesforce1 Platform.
On Wednesday, Feb 26, 10am Pacific, Sean and I will be presenting the results of his efforts in a live webinar: Fast Parallel Data Loading with the Bulk API. We’ll give you some general concepts to start with, then run through several data load case studies, including live data loads. Along the way, you’ll learn why research, investigation, and planning are important to avoid parallel loads that run more like serial loads because of various throughput inhibitors, and easy ways to work around such problems. In summary, this webinar will help you set up data loads that rip.
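To ground the terminology, here is a minimal sketch (not taken from the webinar) of what controlling a load's concurrency looks like with Bulk API v1, where each job carries an XML `jobInfo` body whose `concurrencyMode` element selects parallel or serial batch processing. The helper name, API version, and surrounding workflow are illustrative assumptions; only the XML structure follows the Bulk API job format.

```python
def build_job_request(operation, sobject, concurrency_mode="Parallel",
                      content_type="CSV"):
    """Build the XML body for a Bulk API v1 create-job request (sketch).

    concurrency_mode: "Parallel" (the default) lets Salesforce process a
    job's batches concurrently; "Serial" forces one batch at a time, a
    common fallback when parallel batches contend for the same record
    locks and throughput degrades.
    """
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<jobInfo xmlns="http://www.force.com/2009/06/asyncapi/dataload">\n'
        f'  <operation>{operation}</operation>\n'
        f'  <object>{sobject}</object>\n'
        f'  <concurrencyMode>{concurrency_mode}</concurrencyMode>\n'
        f'  <contentType>{content_type}</contentType>\n'
        '</jobInfo>'
    )

# Example: a serial insert job for Accounts. In a real load you would POST
# this body to /services/async/<version>/job with your session ID in the
# X-SFDC-Session header, then add batches to the job.
print(build_job_request("insert", "Account", concurrency_mode="Serial"))
```

The interesting design point the webinar digs into is when to flip that one element: parallel mode is far faster in the best case, but lock contention can make it behave worse than an explicitly serial load.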
As always, our team will be on hand to answer your questions live. Please join us.
About the Author
Steve Bobrowski is an Architect Evangelist within the Technical Enablement team of the salesforce.com Customer-Centric Engineering group. The team’s mission is to help customers understand how to implement technically sound Salesforce solutions. Check out all of the resources that this team maintains on the Architect Core Resources page of Developer Force.