Light Integration with Salesforce Using Data Loading Tools

03/10/2015 by Ashley Leifer

Moving data from a separate database into Salesforce can be approached in many different ways; the right one depends on how you need to track that data in Salesforce. There is the one-time data dump, where you move all data from one system into Salesforce and retire the previous database. There is what I call ‘light integration,’ where you periodically move data from one system, say a core banking system, into Salesforce on a somewhat manual basis. Then there are the more robust integrations, set up to automatically bring data into Salesforce via ‘batches’ (the integration runs at a specified day and time) or real-time synchronization, where data flows one or both ways in real time. These robust integrations require no human interaction for syncing to occur.

For the purposes of this blog post I’m going to focus on the ‘light integration’ process and things to consider when setting up and running the integration. Regardless of what system you’re pulling from and what tool you use, the fundamental steps are extraction, cleanup and mapping, and finally uploading.

Extraction - Pulling Data out of a Database and into a File

The first thing to consider when you’re looking to do a light integration between an outside database and Salesforce is what data points you need to extract and store inside Salesforce. Consider what is needed for your organization to see a ‘360 view’ of your customers. How you extract the data depends on the database you’re working with, but you want to pull it into an Excel file structured as cleanly as possible. For example, don’t put address, city, state, etc. all in one column; split those out into separate columns, since they will need to be pulled into Salesforce that way. The same goes for name: extract so that salutation, first, and last (and, if configured, middle and suffix) are each in their own column. This will save you a lot of time cleaning up your spreadsheet prior to uploading, especially if you are doing this at regular intervals such as once a week. Since you will be extracting on a recurring basis, make sure you pull the data into the spreadsheet consistently: column headers should be named the same thing and appear in the same order each time.
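To make that concrete, here is a minimal sketch of that split-and-reorder pass in Python with pandas. The file names and the combined ‘Full Name’ and ‘Address’ columns are hypothetical stand-ins for whatever your extract actually produces; treat it as an illustration of the idea, not a ready-made script.

```python
import pandas as pd

# Hypothetical extract with combined "Full Name" and "Address" columns.
raw = pd.read_csv("core_banking_extract.csv")

# Split "First Last" into separate columns. Rows with middle names or
# suffixes would need extra handling (or better, separate source fields).
name_parts = raw["Full Name"].str.split(" ", n=1, expand=True)
raw["FirstName"] = name_parts[0]
raw["LastName"] = name_parts[1]

# Split "123 Main St, Springfield, IL, 62701" into address components.
addr = raw["Address"].str.split(",", expand=True)
raw["MailingStreet"] = addr[0].str.strip()
raw["MailingCity"] = addr[1].str.strip()
raw["MailingState"] = addr[2].str.strip()
raw["MailingPostalCode"] = addr[3].str.strip()

# Always write the same headers, in the same order, every time.
columns = ["FirstName", "LastName", "MailingStreet", "MailingCity",
           "MailingState", "MailingPostalCode"]
raw[columns].to_csv("salesforce_ready.csv", index=False)
```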

Cleanup and Mapping

No matter how cleanly you can extract data, there is always a bit of cleanup to do. For instance, date columns need to be formatted as ‘mm/dd/yyyy’ and currencies should be in a text or general format. Make sure email addresses follow the standard <text with no spaces>@<text>.xxx format and websites are www.<text>.xxx (you can include the http:// if you want). If there’s a way to set these during extraction, great!
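Here is a minimal sketch of that cleanup pass, again in Python with pandas, assuming a spreadsheet like the one from the extraction step. The ‘Open_Date’, ‘Email’, and ‘Website’ column names and the file names are hypothetical, and the regular expressions only enforce the rough shapes described above, not full validation.

```python
import pandas as pd

df = pd.read_csv("salesforce_ready.csv")  # output of the extraction step

# Force a hypothetical date column to mm/dd/yyyy text so Excel's
# auto-formatting can't change it between extracts.
df["Open_Date"] = pd.to_datetime(df["Open_Date"]).dt.strftime("%m/%d/%Y")

# Rough shape checks only: <text with no spaces>@<text>.xxx for emails,
# www.<text>.xxx (http:// optional) for websites.
email_ok = df["Email"].str.match(r"^\S+@\S+\.\w+$", na=False)
site_ok = df["Website"].str.match(r"^(https?://)?www\.\S+\.\w+$", na=False)

# Set aside the failures to fix by hand before uploading.
df[~(email_ok & site_ok)].to_csv("rows_to_fix.csv", index=False)
df[email_ok & site_ok].to_csv("clean_upload.csv", index=False)
```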

The first time I set up a spreadsheet for light integration, I like to walk through each column and plan out how it will map to Salesforce before I do the actual mapping in the tool I’m going to use for uploading. That way I know what objects are involved, how many upload scenarios will need to be created and saved, and what columns, if any, need to be added to the spreadsheet each time I prep for uploading. The columns that need to be added, I find, are typically record types, ownership, and reference fields. You can avoid needing to add the reference fields if you have the ideal: external unique identifiers. These are identifiers that come from your database and uniquely identify each entity, individual, product, or service for which you may pull in new or revised data. Having external unique identifiers in your spreadsheet, especially for entities and individuals, saves a lot of time and eliminates the possibility of adding duplicate records to Salesforce. The less ideal alternative is to create reference-field columns for Salesforce IDs and run ‘vlookups’ for the ID based on what I would consider softer data in your spreadsheet (for example, an entity name column or an email address for an individual). This is more prone to errors and potential duplicates.
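If you’d rather do that lookup outside of Excel, the same ‘vlookup’ can be a pandas merge. The sketch below assumes you’ve exported a Salesforce report of existing record IDs alongside the external unique identifier; the Core_Banking_Id__c field name and both file names are hypothetical.

```python
import pandas as pd

upload = pd.read_csv("clean_upload.csv")
# A Salesforce report export with the record Id and the external unique
# identifier (hypothetical field name Core_Banking_Id__c).
existing = pd.read_csv("existing_records.csv")

# Joining on the external unique identifier is exact. Joining on softer
# data like a name or an email is where duplicates sneak in.
merged = upload.merge(existing[["Id", "Core_Banking_Id__c"]],
                      on="Core_Banking_Id__c", how="left")

# Rows with no Salesforce Id will be inserts; the rest will be updates.
print(merged["Id"].isna().sum(), "new records,",
      merged["Id"].notna().sum(), "updates")
merged.to_csv("mapped_upload.csv", index=False)
```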

Once you feel your spreadsheet is clean and mapped as needed, it’s time to use a migration tool to do the final mapping scenarios, save those scenarios for future uploads, and finally perform the upload to Salesforce.

Migration Tool (or ETL)

There are many tools to choose from. For light integration you could use the Apex Data Loader, but I’ve found it has too many limitations: your spreadsheet has to be verbatim what you will map into Salesforce, and there are no flexibility or time-saving tricks you can use with this tool. Dataloader.io is another great cloud-based tool; it’s free and works with Professional Edition. However, I find it limited in similar ways to the Apex Data Loader. One nice thing about Dataloader.io in comparison to the Apex Data Loader is worth mentioning: instead of having to vlookup the Salesforce ID for reference fields, it lets you map a reference field using a column such as name, essentially doing the vlookup right in the tool for you.

My preference is to use the Jitterbit Cloud Data Loader (yes, it’s also free). Not only does it save the mapping, but also the full upload scenario for each object touched in a recurring import. Once it’s set up, all you need to do for recurring uploads is point the scenario to the new CSV file and hit ‘upsert’ (upsert is the best import option for recurring data uploads; the sketch below shows why). With Jitterbit you can also build formulas for naming conventions, and if something like a record type or owner is the same for the whole spreadsheet, you can build this right into the mapping instead of adding those columns to the spreadsheet each time. Whatever tool you choose, make sure it has the capability and flexibility to do as much of the legwork as possible via mapping so you don’t have to spend too much time cleaning up the spreadsheet; collectively, over time, this will be a huge time saver.
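Why upsert? Keyed on an external unique identifier, an upsert updates the record when the key already exists and inserts it when it doesn’t, which is exactly what you want on a recurring upload. Here is a toy Python sketch of that behavior; it’s an in-memory stand-in, not a call to any real data loader API, and the Core_Banking_Id key and sample records are made up.

```python
# Existing Salesforce data, keyed by the external unique identifier.
existing = {"CB-1001": {"Name": "Acme Corp", "Phone": "555-0100"}}

# This week's extract: one changed record, one brand-new record.
incoming = [
    {"Core_Banking_Id": "CB-1001", "Name": "Acme Corp", "Phone": "555-0199"},
    {"Core_Banking_Id": "CB-2002", "Name": "Globex Inc", "Phone": "555-0200"},
]

for row in incoming:
    key = row.pop("Core_Banking_Id")
    if key in existing:
        existing[key].update(row)  # match found: update in place, no duplicate
    else:
        existing[key] = row        # no match: insert as a new record

print(existing)  # CB-1001 updated, CB-2002 added; still exactly two records
```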

For light integrations, thoroughly document each of these steps as you walk through them the first time. It’s not only a great reference for the next few times you need to import data into Salesforce (until you have it down to a science), but if anyone else needs to own this responsibility, it’s well spelled out for them. Document the tool you use, the cleanup steps you need to take in the spreadsheet, the scenarios saved in your tool, and so on. Once you’ve run through these steps a few times, you’ll become a master at it and very efficient.

Do you have your own tips and tricks for light integrations? Other tools you like to use? Please feel free to comment below, reach out on our Facebook page, tweet at me directly at @LeiferAshley, or find me in the Success Community.