Integration can mean many things, from a simple custom link to the real-time movement of data between two different systems. In the following use cases we are going to look at integrating core banking systems such as Jack Henry, Fiserv, and FIS with Salesforce, discussing some of the pitfalls and best practices along the way.
The first area to dig into is the two typical integration types. Real-time integrations are where the two systems share data as events happen. An example would be the account opening process, where a bank wants a newly closed-won opportunity to send a real-time message to the core banking solution to establish the new financial account, with data items like the account number and customer data coming directly from Salesforce. This could also work the other way, originating from the banking core to Salesforce, with the limitation usually being the banking core's lack of accessible programming interfaces.
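To make the real-time case concrete, here is a minimal sketch of translating a closed-won opportunity into a new-account request for a core banking API. The endpoint URL, custom field names (`Product_Code__c`, `Tax_ID__c`, `Branch_ID__c`), and payload shape are all hypothetical; every core vendor defines its own interface.

```python
import json

# Placeholder endpoint; a real core banking API would be vendor-specific.
CORE_API_URL = "https://core.example-bank.com/api/accounts"

def build_account_request(opportunity, contact):
    """Translate Salesforce records into a hypothetical new-account payload."""
    return {
        "productCode": opportunity["Product_Code__c"],  # assumed custom field
        "customerName": contact["Name"],
        "taxId": contact["Tax_ID__c"],                  # assumed custom field
        "branchId": opportunity["Branch_ID__c"],        # assumed custom field
    }

# Example records, shaped like Salesforce API query results.
opp = {"Product_Code__c": "CHK-01", "Branch_ID__c": "042"}
person = {"Name": "Pat Smith", "Tax_ID__c": "***-**-1234"}
body = json.dumps(build_account_request(opp, person))
# `body` would then be POSTed to CORE_API_URL by your middleware.
```

The point of the sketch is the mapping step: the translation layer between the Salesforce data model and the core's expected format is where most of the integration work lives.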
The second, and more popular, integration type is batch. This is where a large volume of data is moved from one system to the other in a scheduled, automated fashion. For example, a bank might want to populate client data from its core into Salesforce, so it will build a batch process to extract the data from the core, transform it to fit the new Salesforce data model, and load it using the Salesforce APIs. This would typically be scheduled to run once or twice a day, and while the data is not real time, for customer relationship management purposes it is usually sufficient. This process of extracting, transforming, and loading (ETL, anyone?) brings us to the tools of the trade.
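The transform step of that batch process can be sketched in a few lines. This example assumes a hypothetical core extract with columns CUST_NO, FIRST, and LAST (real extracts vary by vendor) and maps it to Salesforce Contact fields, deduping on the core customer number along the way; `Core_Customer_ID__c` is an assumed external ID field.

```python
import csv
import io

# Hypothetical core extract; note the duplicate row for customer 1001.
core_extract = """CUST_NO,FIRST,LAST
1001,Pat,Smith
1002,Sam,Jones
1001,Pat,Smith
"""

def transform(raw_csv):
    """Dedupe on the core customer number and map to Salesforce field names."""
    seen, records = set(), []
    for row in csv.DictReader(io.StringIO(raw_csv)):
        if row["CUST_NO"] in seen:
            continue  # drop duplicate rows from the core extract
        seen.add(row["CUST_NO"])
        records.append({
            "Core_Customer_ID__c": row["CUST_NO"],  # assumed external ID field
            "FirstName": row["FIRST"],
            "LastName": row["LAST"],
        })
    return records  # ready to upsert via the Salesforce API

contacts = transform(core_extract)
```

Keying the upsert on an external ID from the core is what lets the nightly job run repeatedly without creating duplicates on the Salesforce side.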
Tons of Tools
Any integration is going to need some tools, but picking the best one for the job can be a lot to manage. The first thing to consider is what I would call "flat loaders," such as the Salesforce.com Data Loader or free tools such as Jitterbit. For these, the data coming from the core has to be well organized and singular in nature: each file needs to contain only one type of data (Clients, for example) and require no transformation (formatting, splitting, or other complex manipulation). While most banking cores have tools to get the data out, it is usually not as clean as it needs to be, with lots of duplicates and mixed data. This is when you need a bigger tool (cue JAWS score).
The next step up (and it is a big step) is Extract Transform Load (ETL) software from providers such as Informatica, Jitterbit, Talend, Actian, and Boomi. These are cloud or on-premises tools with built-in connectors that allow for complex logic, such as grabbing a set of files from an SFTP site, sorting, de-duping, and transforming the data, and then upserting it into Salesforce in a brand new data model. These tools come with the drawbacks of extra expense ($500 a month and up) and complexity, but they provide better overall support and handling of larger data sets. Not a must, but highly recommended.
New Data Model
The next hurdle is the complex nature of core banking data, which originates in a traditional database schema but needs to end up in Salesforce, which supports only some basic database concepts. The first issue to keep in mind is the many-to-many relationship of banking products to customers: a customer can have many banking products (Savings, Checking, Debit), but those products can also have many different customers attached to them (Signers, Owners, Primary, Secondary, Account Holders, Organizations). Building the correct data model will make any data integration much easier.
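The many-to-many shape described above is typically modeled with a junction record that links a person to a financial account along with their role. Here is a small sketch of the idea; the record shape and names are illustrative, not an official Salesforce schema.

```python
# Each dict stands in for one junction ("account role") record linking
# a contact to a financial account with a specific role.
roles = [
    {"contact": "Pat", "account": "CHK-100", "role": "Primary"},
    {"contact": "Sam", "account": "CHK-100", "role": "Signer"},
    {"contact": "Pat", "account": "SAV-200", "role": "Owner"},
]

def accounts_for(contact):
    """All financial accounts a given contact is attached to."""
    return [r["account"] for r in roles if r["contact"] == contact]

def contacts_for(account):
    """All contacts (and their roles) on a given financial account."""
    return [(r["contact"], r["role"]) for r in roles if r["account"] == account]
```

With the junction in place, both directions of the relationship fall out naturally: one contact can sit on many accounts, and one account can carry many contacts with different roles.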
The second consideration is the complexity of different types of entities, such as clients, customers, prospects, organizations, trusts, businesses, and households. While some, such as clients, seem like easy matches to Salesforce.com Contacts, others, like Trusts or Households, are not as clear cut. Depending on the data source and business objective, some of these questions can be handled at the core or on platform. For example, here at Arkus we have built a custom householding application that runs on the Salesforce1 platform and does the work of much older, slower, and more expensive householding applications. That, though, is a blog post for another time.
Process Flow & Record Ownership
The last consideration is the flow and ownership of the data. Most banking cores don't have a concept of data ownership, so matching that up with Salesforce, where every record must be owned by a licensed Salesforce user, can be a challenge. The best practice is to work with the business to figure out where record ownership matters (Opportunities) and where it might not matter at all (Households), then work that into the integration logic.
It is also important to take into consideration the flow of the data from the core into Salesforce and what is editable. In a daily batch example, any data element that is coming from the core should be locked down in Salesforce as any changes will just be overwritten during the next run. The tip here is to use record types and page layouts to control the different fields and when they can and cannot be edited.
There is a lot to consider in a core banking integration, and while it is certainly not impossible, the more upfront thought and planning you put into it, the greater your chance of success.
If you have questions, suggestions or other tips on doing large scale integrations feel free to post them below in comments, in the Success Community, on our Facebook page or to message me directly at @JasonMAtwood.
Getting a Salesforce.com credential provides quite a bit of value to your career. As the certification website announces, “Get cloud-certified. Become indispensable.” However, keeping your credentials does require taking shorter, additional re-certification exams throughout the year. This blog post will cover strategies to re-certify.
A little more about the what and why first. As each Salesforce.com certified administrator or developer knows, there are a couple of available tracks to follow in certifying with Salesforce.com. These tracks are based on either the Administrator (ADM201) or the Developer (DEV401) certification. If the prerequisite exam for your advanced certification(s) was ADM201, then you'll need to maintain that credential by taking the release exams announced to ADM201 holders. The converse is true for DEV401.
The short exams of five to twelve or so questions are released on the cadence of the Salesforce releases. So when Winter is rolled out in your instance in the fall, the release exam is soon to follow. (And since Summer ‘14 was recently released, those release exams are now available!) The questions are open-book, and you can also exit the exam and re-enter another day if you need more study time.
Just know, if you are both Developer and Administrator certified you will need to take a release exam six times per year. And if you don’t take your release exams, you will eventually lose your hard-earned credentials. The release exams are intended to ensure that every certified professional stays up to speed on the latest features and functionality. So in that sense, they really are good for you! So why are you hesitating?
1. You have to take both release exams… which is overwhelming.
I’d say the strategy in this case is to find a cadence that works for you. I prefer taking the Developer re-certification first, then taking the Administrator one, as most of the questions in the DEV401 re-certification exams reappear in the ADM201. So, I know that if I don’t do well in the former I likely won’t pass the latter.
However, some I know do the reverse. A friend of mine recently described in front of the entire local user group that he takes the Administrator exam first then the Developer re-certification, since he knows that the Administrator will take him eight minutes and the Developer will take him four. Either way you do it, develop your own strategy & cadence…own it.
2. You didn’t take last release & haven’t caught up yet.
If you go into a release exam cycle without having taken the prior exam, you will have to start with that one and catch up. And you can’t let it slip even one more release or you’ll lose your certification. Say you received the Spring ‘14 re-certification exam notification and made a mental note, but didn’t yet take it. Then the Summer ‘14 notice came out a few weeks back. Now you have until the Winter ‘15 release to catch up.
Not only that, once you do take Spring ‘14’s re-certification exam after Summer ‘14 goes live, you’ll be at risk of answering Spring’s exam with items from Summer’s functionality. Hey, it happens, but should it? So let’s make sure to get on top of this.
Dedicate an hour to watching the release training videos, taking screenshots of pertinent slides, then take the recertification exam. Rinse & repeat for the next release as quickly as possible. You’ll feel much less on edge and stressed. Not to mention that staying current on your release exams also means you get to keep your other advanced Salesforce.com certifications. All hail the certified professional.
3. You think it’s gonna be a piece of cake…but you failed before.
Confidence, good. Speeding through without being thorough, not so good. You get twenty or thirty minutes, it’s open book, so go through the exam questions a couple times. The first time, mark the answers that you’re not 100% confident on with the “mark for review” flag. The next time you go through your exam questions, take time to search for answers in the release notes, review your notes or screenshots from the release videos, and answer the remaining questions. Before you hit submit, take note of any topics or features you still may not feel confident in…and...submit.
If you don’t pass, you know what you need to dig into on review. Take a few days to go over it a little more. Log into your developer instance and play with and test the feature. Then, before they announce any new releases, take your exam again. You’ll be on much better ground.
4. You’re too busy with your “real” work. You’ll get to it later.
If you know Salesforce.com functionality and work with a good chunk of features, it shouldn’t take long. It often takes me less than ten minutes per release exam. Maybe eleven minutes, since I need to log into the certification site, open up the release notes & my screenshots. Just be careful about leaving it too long. As we mentioned before, you risk lapsing in your certification. And can you really do your day job while you study for the full exam again? We didn’t think so.
For the story on this straight from Salesforce.com, download their Release/Maintenance Process + Resources Overview to review the most recent process and resources available. Haven’t yet taken a certification? Go to the Salesforce Certification site for information on certifying, official study guides, and signing up for an exam.
Don’t hesitate to get in touch if you have additional points or questions on re-certifying, find me on Twitter at @SeriouslyKyla or comment on the Salesforce Success Community, our Facebook page or comment below.
The Nonprofit Starter Pack has historically been a free set of managed packages offered by the Salesforce Foundation in an effort to get nonprofits started on the Salesforce platform and to better fit individual donors and membership. Salesforce.com is releasing the new version of the Nonprofit Starter Pack, version 3.0 (also known by its development title, ‘Cumulus’), within the next month or so (no exact date has been set), and it will be free. Currently it’s still in an ‘extended’ pilot stage, and you can request to be part of the pilot through the Power of Us Hub.
Some of the main goals for this release are to fix the general structure of the package and to provide an application that’s easier to configure and use. They’ve also consolidated what used to be six packages into a single package to install, and (this is really great) Salesforce.com delivers all future upgrades automatically.
Installing in a sandbox organization first is highly recommended, as existing customization can interfere with the upgrade. Once you install, go ahead and run the Health Check. It’s a great included tool that checks for inconsistencies and potentially conflicting customization against the NPSP 3.0 upgrade. If you get errors, instructions are provided to easily fix them.
Here are a few of the more notable features and changes coming with this release:
New Household Account Model
Many nonprofits are already using this model, but through 3rd party applications or custom development. NPSP 3.0 creates a Household record type on the Account object to better support an organization/donor model. Inside the settings tab you also have the ability to customize the household naming convention, which is nice in case you have multiple households with similar names.
A primary contact field is created to identify the point of contact for the household. If you’re using the Household Account model, this contact is used as the default for the primary donor on the account.
Primary Affiliation
This is a lookup field to Affiliations on the Contact object. Likewise, there is a ‘Primary’ checkbox on the Affiliation record. Its main purpose is to identify the main association between contacts and accounts. The checkbox and lookup fields work together: if the contact on the lookup is changed, the affiliation record will appropriately adjust the checkbox for the new primary contact.
Addresses & SmartyStreets
Using the Household Account model, all addresses are stored at the Account level. The one selected as the primary address will be used on the individual contact records associated with the household. There is, however, an option to override this on the contact record by setting a different address using the new ‘Address Override’ lookup field. This new address will copy into their Mailing Address fields and won't change if the household address changes at the account level.
Seasonal addresses are also new. This allows for different addresses to be identified as primary based on a duration set for that address. You know, for those scenarios where you have a winter house in Hawaii and a summer house in Washington.
SmartyStreets is worth mentioning; it’s a built-in API integration that can verify and supplement address information. Helpful when you know the address and city but not the zip code.
Centralized NPSP Settings
Finally, for enhanced ease of use and efficiency, all of the NPSP settings now live in a single location. They are grouped by function and easy to navigate (household name customization resides here).
I couldn't end this, of course, without mentioning mobile: NPSP 3.0 fully supports Salesforce1. For more details, check out the Power of Us Hub online community (simply use your existing Salesforce.com credentials to log in). There are forums dedicated to this topic as well as the full NPSP 3.0 documentation.
Have your own thoughts on the new Nonprofit Starter Pack, likes and dislikes? Please feel free to comment below, on our Facebook page, directly at me on Twitter @LeiferAshley, or in the Power of Us Hub or Success Community.
Currently, it is not possible to report on attachments directly within Salesforce, so it has been a challenge to access information about the attachments that exist within a Salesforce org. As an organization, you may want to track what types of attachments are being utilized and on which records. Data loaders not only allow you to see all of the information about an attachment (name, description, created date, etc.) but will actually extract the attachment itself. You will end up with a zip file of all of the attachments that you want to see or report on within your Salesforce org. I will walk you through one of my favorite data loaders, dataloader.io, and show you how it can help you access your attachments.
Dataloader.io can be used with any Salesforce org, including Group and Professional Editions (the API is not needed). It works very smoothly and quickly for importing and exporting data, and the same is true for exporting attachments; the process only takes a few steps. It will give you a report with the attachment fields you selected, along with the actual attachments, all neatly packaged in a zip file.
Accessing your Attachment Data
Once you have logged into your Salesforce org via dataloader.io, select Task → Export → Attachments → Select Fields → Add Filter (optional) → Save & Run. It’s really that easy.
After it runs successfully, you will see a link to your attachments. Clicking it downloads a zip file; open it and you will see your attachment report spreadsheet along with all of the attachments.
Reporting on Your Attachment Data
When you open the attachment export spreadsheet you will see that it provides all of the fields you selected, which you can then use for reporting. You can filter and sort by many of the fields such as created date, IDs for users who created it, and IDs for the parent record that the attachment is related to in Salesforce.
Note: The Body field is the one designated for the actual attachment content, so that field will always be empty in the spreadsheet and can simply be removed after the export.
Since the export only provides IDs, as you may have already noticed, here’s a tip for interpreting them that will help decipher which object a Parent ID refers to. The first three characters of a Salesforce ID identify the object. For example, Accounts begin with 001, Contacts with 003, Users with 005, and Opportunities with 006. Custom objects are slightly different: their IDs begin with a0, and the character that follows tells you which custom object it is.
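That prefix trick can be turned into a small helper for your export spreadsheet. The four standard key prefixes below are the ones listed above; the custom-object branch just labels the ID generically, since which custom object an a0-prefix maps to depends on your org's schema.

```python
# Standard object key prefixes (from the tip above).
KEY_PREFIXES = {
    "001": "Account",
    "003": "Contact",
    "005": "User",
    "006": "Opportunity",
}

def object_for_id(record_id):
    """Return the object name for a 15- or 18-character Salesforce ID."""
    prefix = record_id[:3]
    if prefix.startswith("a0"):
        return "Custom Object"  # which one depends on your org's schema
    return KEY_PREFIXES.get(prefix, "Unknown")
```

Running each Parent ID through a function like this (or the equivalent spreadsheet formula on the first three characters) gives you a column you can filter and group by object.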
With this information in hand, you can use a formula or filter from the Parent ID column to see all of the attachments for a particular object. For example, you can filter Parent ID to show all IDs containing 003, which will be all of your Contacts. This is a quick and easy way to isolate your attachment groups. That’s pretty much it; attachment reporting made easy.
If you have never ventured into getting attachment data, you will find the process is not as complicated as it may first seem. With access to great tools like the dataloader.io, you can gather this information within minutes and now you have some tips that will hopefully give you ways to also easily report on your data.
If you have a great way to access and report on attachments and would like to share your story, please feel free to comment below, on our Facebook page, directly at me on Twitter @sylviacabral44, or on the Salesforce Success Community.
The Salesforce Success Community is an online collaboration hub where Salesforce customers can learn, get answers to questions and share new ideas on all things Salesforce and beyond.
One topic of great importance to me, other than Salesforce, is wellness. I have had a long battle with anxiety since I was a kid, along with other health issues that I am proud to say I have overcome. I learned a lot on my journey, and I was looking for ways to share some of the resources and ideas I found to be useful motivational tools. Seeing Salesforce dedicate a whole day to health and wellness at Dreamforce inspired me to create a group in the community called WELLforce. The goal of this blog post is to make you aware of a few groups focused on wellness and to hopefully inspire you to come up with other topics you are passionate about and start sharing.
The Health and Weight Loss Challenge, started by Bill Greenhaw, was created to hold him accountable for losing a few pounds before Dreamforce 2013. It got an amazing response and grew to over 200 members. Sponsors got involved, and people were given some really nice prizes for what they accomplished leading up to Dreamforce. How amazing is it that one man’s goal led to a movement of over 200 people who got healthier in some way, shape, or form leading up to Dreamforce 2013? This year it has almost doubled in size, and people use the group to share all kinds of health-related things, from recipes to fitness programs. I have also seen it spark conversations about other types of challenges, including a dedicated Dreamforce 5K (which also happened last year).
I have studied the topic of wellness for many years now, and much of what I learned I have applied to my own life. I started this Success Community group after Dreamforce 2013 to spread the knowledge I had gained from my journey and to connect with others who were looking for better wellness in their lives. Other members and I share motivational quotes, book reviews, and resources to help with nutrition, stress, and pretty much anything else that can better your health mentally and physically. Hopefully with this post we can get more of you to join and share. (wink wink)
Running for Success, formerly the Holiday Running Streak group, started with the goal of helping people get through the busy and very stressful holiday season by motivating them to run at least one mile a day from Thanksgiving through New Year’s. It has since shifted its focus to motivating runners throughout the entire year and is a great place where people share their running accomplishments. One of the things people do in the group that I particularly like is sharing photos. This makes the collaboration more personal and gives a quick view into the lives of the people we are sharing with. If you are a runner and could use a little motivation (or are looking for some competition), I suggest you check it out.
In addition to spreading the word about some good groups you can check out in the Success Community, I hope you realize the opportunity the community provides. When people have something in common and a place to gather and collaborate, it becomes a breeding ground for education, innovation, and impactful change, both personally and professionally. Just look at what happened when one person wanted to lose some weight and decided to share his goal with his fellow Salesforce mates. I encourage you to join some groups, and if you feel a group is missing, by all means go create one!