Blog Posts

Contributing To Open Source Projects As a Learning Tool, Part 1

If you’re interested in learning how to code Apex, Visualforce, and Lightning Components, the many open source projects on the Salesforce platform are a great resource. What is Open Source, and how is it used on the Salesforce platform?

A few weeks ago, Amy Bucciferro wrote about different learning styles, and how the tools for learning to use, administer, and develop on Salesforce align with certain learning styles. As a largely self-taught coder, I have found the open source projects that members of the Salesforce community have made available to be an incredible resource. Reading the code and documentation that others have written, and then eventually submitting documentation and code for inclusion in a project, has helped me gain skills and confidence as a developer.

Since even the clicks-not-code parts of Salesforce apps can be stored and shared as text-based metadata, app builders who use these declarative development tools can also gain valuable insights from, and contribute to, open source apps.

What is open source software?

The Open Source Initiative defines the term as "software that can be freely used, changed, and shared (in modified or unmodified form) by anyone."

This means that the source code — the text files used to create a piece of software — is available to anyone, that anyone is allowed to modify that source code and, if they like, create a new version of the software program. There are a variety of ways that an open source software project can be organized (or not), and a variety of licenses that can be applied to the code, but the bottom line is that the code is available for you to use as you see fit.

Open source software on the Salesforce platform has a couple of attributes that open source on most other platforms doesn’t. The big one is that the Salesforce platform itself, which is required to run any code written for it, is not open source and, for any real business use, requires paid (or donated) licenses. Another challenge has to do with how apps are packaged and distributed: there are practical pitfalls to taking an app, modifying the code, and installing that code into your Salesforce instance. If you do that, you can’t easily install upgrades that the original app publisher provides.

Finding and installing open source projects

Despite these limitations, just having access to the source code of a Salesforce app is incredibly empowering. This lets anyone who can read code see exactly how the app functions. As an app builder, you can install it into a developer org to see how the Lightning Processes, Flows and Workflow Rules are set up, without having to read any of the code.

There are two main ways that open source software for Salesforce is shared. The first is as an unmanaged package. To install an unmanaged package into a sandbox or developer org, you just need its install link, similar to most other apps you get from the AppExchange. The difference is that you’ll be able to see and modify everything that the unmanaged package adds to your org. However, that editability comes at a price: you can’t upgrade an unmanaged package in place; you have to uninstall and then reinstall, and uninstalling deletes all the data in any objects or fields that are part of the package.

A great example of an unmanaged package that you can use for learning is Salesforce’s own Dreamhouse sample app. The Dreamhouse App was actually built to be a tool for demoing and teaching about Salesforce features, and it has lots of declarative goodness if you’re not interested in the code. Salesforce has released several other free and open source tools as unmanaged packages over the years: Action Plans and Milestones PM are particularly well-known ones, and offer great opportunities for learning some code.

The other main way that open source projects for the Salesforce platform are shared is via source code repository sites. GitHub is far and away the most popular site for this, but sites like BitBucket and GitLab can be used as well. Getting the code and other metadata into an org to see how things actually work takes some additional technical steps in many cases — “cloning” the repository onto your local hard drive, and then using one of the IDEs for Salesforce development or the Force.com Migration Tool to deploy the code and other metadata into an org. Fortunately, Andy Fawcett of Financial Force, who has made a lot of contributions to the Salesforce open source community, has provided a Heroku-based tool that helps you to deploy from a GitHub repository (or “repo”) directly into a Salesforce instance. Many Salesforce projects’ GitHub repos feature this button:

Deploy to SF button

which you can click to be linked to Andy’s Deploy to SF Heroku app and install the project. For those projects that don’t offer the button, you can try this JavaScript bookmarklet that I built. When you are on the main page of a GitHub repository, you can use the bookmarklet to pass the repo’s URL to the Deploy to SF app.

In the next post, I’ll dig into some of my favorite open source projects, and what specific lessons can be drawn from them.

Do you have favorite Salesforce open source projects that you’ve learned from? Want more details on how to get started with open source code? Share your stories and questions with me on the Arkus Facebook page, in the comments below, in the Success Community, or to me directly via Twitter at @tet3. You can also see my GitHub profile for repositories I’ve starred and forked for my own future learning opportunities.

Photo credit: "16th st" (CC BY-ND 2.0) by Peter McCarthy

Salesforce Summer 17 Ideas Delivered

Salesforce again delivers on Ideas from the Community in the Summer ‘17 release.

Though the list of ideas delivered in this release isn’t huge (it’s just 10 ideas), there are some good ones this time around. The point count this time is a total of 33,160, not a lot compared to the last release. Let’s take a look at the Ideas delivered with this latest release:

Lightning Experience

Access Field History Related Lists in Lightning Experience - Yeah! Tracking field history on standard and custom objects is now visible in Lightning Experience. If you’ve added the related list to a Classic page layout, you will now see it in Lightning (and of course you can now add it to the layout in Lightning as well). This idea was delivered with a total of 5,020 points.
 
There Are Lots of Shiny New Features for Streams - If you haven’t paid attention to this, Chatter Streams are custom feeds you create by combining multiple related feeds. You can combine feeds from people, groups, and records. In this release, Streams get their own home page (click on the Streams heading from the Chatter home page), plus actions and filters, and you can now create up to 100 Streams. The Idea addressed by this feature had gathered a total of 4,500 points.

Both Lightning Experience and Salesforce Classic

Lightning for Gmail: General Availability and Improved Efficiency - The most notable of the ideas delivered affecting both Classic and Lightning revolve around Lightning for Gmail. Generally available in this release, Lightning for Gmail is a Chrome plug-in, but first your administrator needs to perform a few steps to enable it in your Salesforce instance: making sure Enhanced Email is enabled for the organization, contacting Salesforce to get access to Enhanced Email for Lightning for Gmail, and enabling the feature. The combined Ideas related to Gmail integration in this release (Integrate Gmail with Salesforce, Relate Gmail emails to Salesforce, Add Attachments from Gmail to Salesforce) total 7,640 points. Here are some of the features worth mentioning:

  • Relate emails to records as you are composing the email by selecting relevant accounts, opportunities, cases, or custom objects. The email will be related to the records selected and to the relevant contacts, leads, and users associated with them.

  • When you’re ready to compose an email, you can select a Lightning email template, modify it, and even add merge fields. This capability is in Lightning Experience only.

  • After the initial setup and authentication, Lightning for Gmail saves the user's credentials, so you don’t have to keep logging in each time you want to work in Gmail.

  • Pick and choose which email attachments you want added to records, a very necessary feature, when there are so many images included in signatures out there!

  • With both Lightning for Gmail and Lightning Sync set up, you can have your Events synced between Salesforce and Google Calendar.

  • Not sure if you’ve added that email in Gmail to a Salesforce record? Lightning for Gmail adds labels, so you can tell which emails are already related. You can filter by these labels and assign color coding to them. 


Do you have other favorite Ideas delivered in this latest release I didn’t mention? Please feel free to comment below, on our Facebook page, or directly at me on Twitter @LeiferAshley or in the Success Community and Power of Us Hub.

Talend & PostgreSQL - Part 2 - Setup

This is a follow-up post to part 1, “Talend & PostgreSQL - Data Migration Dream Team”.

This is part 2 of a blog post about utilizing Talend and a PostgreSQL database to assist with data migrations. Part 1 outlines the reasons to use an ETL + database approach to migrating data and gives an overview of the tools involved. You’ll also find links there to download and install the tools needed to complete the following setup.



Setting Up Your Database


Once you’ve installed your tools, we’re ready to start setting up connections to our various datasets: in this case, two Salesforce orgs and a PostgreSQL database. First, let’s add a database and a table to our PostgreSQL server. Open pgAdmin and go to ‘Object’ → ‘Create’ → ‘Database’. I’ll call mine ‘IDLookup’. Save.


Next we’ll add a table with some columns.  

Go to ‘Tools’ → ‘Query Tool’ and paste in the following:


CREATE TABLE id_table
(
   "Legacy_ID__c" VARCHAR(18),
   "Id" VARCHAR(18)
);


Press Execute / F5

Executing this script will create a table with two columns ‘Legacy_ID__c’ and ‘Id’.

After refreshing the view (‘Object’ → ‘Refresh’), you’ll see the table show up under your database → ‘Schemas’ → ‘public’ → ‘Tables’.


Note that we just created a table via the query tool. Create, update, and delete (CUD) operations are all possible for both tables and records in a SQL database using the query tool.
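
As a minimal sketch of what that looks like (the Id values below are made up for illustration), you could run statements like these in the same query tool:

-- add a mapping row
INSERT INTO id_table ("Legacy_ID__c", "Id")
VALUES ('001A000001AbCdE', '001B000002FgHiJ');

-- correct the new Id for an existing legacy Id
UPDATE id_table
SET "Id" = '001B000002KlMnO'
WHERE "Legacy_ID__c" = '001A000001AbCdE';

-- remove the row again
DELETE FROM id_table
WHERE "Legacy_ID__c" = '001A000001AbCdE';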


Connecting Data Sources to Talend

Now that we have a table to connect to in our database we can switch to setting up Talend.  


Once Talend is installed, be sure to install the additional packages from the Help menu. Some of these will be needed in order to connect with Salesforce.


We’ll start by creating a connection with the PostgreSQL database.  

In Metadata, create a new Db Connection.


Name the connection and fill in the database details. Click ‘Check’ to make sure you’ve successfully connected to the database. The default port is 5432. To confirm which port your server is actually using, execute the following script in the query tool of pgAdmin:


SELECT *
FROM pg_settings
WHERE name = 'port';



Note: mine is running on 5433, so that's what I’ll enter in Talend.


Once you’ve connected to the database, you’ll need to retrieve the schema.

Select the table(s) to include in your schema.



Click ‘Finish’.

You’ll see the database, table, and columns in your metadata repository.

Next, we’ll create connections to each of the Salesforce orgs we’ll be working with.

Create a new Salesforce connection from the Metadata section.

Enter your org credentials and click ‘Test connection’. If you’re connecting to a sandbox, be sure to update the Salesforce URL in the ‘Advanced…’ section from https://www.salesforce.com/services/Soap/u/37.0 to https://test.salesforce.com/services/Soap/u/37.0.



Click ‘Next’ and then select all the objects you’ll be exporting/importing to/from the org.

Repeat for each org you’d like to connect to for this job.



Now that we have all of our connections available to select from Metadata, we can start adding components to our job.


The first thing we’ll do is extract all the Account records from our Source org and move them into our destination org.  


Drag the ‘Account’ object from the source Salesforce org onto the design screen and select the tSalesforceInput component. From the Palette, add a tMap component. Finally, drag the ‘Account’ object from the destination Salesforce org onto the design screen and select the tSalesforceOutputBulkExec component.

Connect the components by dragging the connector from tSalesforceInput to tMap and then from tMap to tSalesforceOutputBulkExec. This should populate the schema from both orgs into the tMap component.

Double-click the tMap and simply drag fields from the source on the left to the destination on the right. Most importantly, map the Id field from the source to an external Id field in the destination org. In this case, I’m mapping to OldSalesforceID__c.


Formula fields and some audit fields will have to be removed from the destination schema. Created Date and Created By Id can be set only on insert, and the profile of the credentials used to log in through Talend will need to have the audit field permissions set.


Run the job.  


Now we’ll create and run a job to populate both the legacy and new Ids from the destination Salesforce org into our PostgreSQL database. This time, after dragging the ‘Account’ object from the destination Salesforce org onto the design screen, select the tSalesforceInput component, followed by a tMap feeding into a tPostgresqlOutput.



Edit the map to put the account’s Id into the Id column of the database schema and the OldSalesforceID__c field into the Legacy_ID__c column.



Deactivate the previous subjob before running this one, then run it.


At this stage, we’ve migrated the accounts from one org to the other and created a map of Ids for those accounts in the id_table of our PostgreSQL database. When we migrate the contacts, we’ll use that map to assign contacts from the source org to the correct account in the destination org.
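
If you want a quick sanity check that the map was populated, you can query the table from pgAdmin (a minimal sketch against the id_table created earlier):

SELECT "Legacy_ID__c", "Id"
FROM id_table
LIMIT 10;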




Next, we’ll set up the job to migrate the contacts and reference the database. Add both the ‘Contact’ object tSalesforceInput and a tPostgresqlInput component that references the id_table as inputs to a tMap component. The output will be a tSalesforceOutputBulkExec pointing to the ‘Contact’ object in the destination Salesforce org.



Map the AccountId field to the Legacy_ID__c field of the tPostgresqlInput, then map the Id field of the tPostgresqlInput to the AccountId field of the output. Referencing the id_table in our database is that simple, and it can be done as many times as necessary for any number of lookups on the object. Just add another instance of the tPostgresqlInput for each lookup (conceptually, each lookup behaves like the SQL join sketched below).
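
As a rough illustration of what the tMap lookup is doing, here is the equivalent SQL join. The contacts_staging table is hypothetical, standing in for the contact rows coming out of the tSalesforceInput:

-- resolve each contact's old AccountId to the new org's Account Id
SELECT c."Id" AS source_contact_id,
       t."Id" AS new_account_id
FROM contacts_staging c
JOIN id_table t
  ON c."AccountId" = t."Legacy_ID__c";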




Run the job and your contacts will insert with the correct account Ids. Since Created By Id and Created Date can only be set via insert, this method allows you to correctly migrate those values.


This is a very basic example of what can be done using the combination of an ETL tool and a database to assist with data migrations. Like anything worth learning, it may take some time to wrap your head around the potential benefit of using these tools. Take your time and make sure you do your learning in a sandbox.



Have any great ETL learning stories? Share them with me on the Arkus Facebook page, in the comments below, in the Success Community, or to me directly via Twitter at @jpbujold

Arkus Pro Bono Day - Spring 17

Arkus’s Spring 17 Pro Bono Day was all about empowering Admins.

As part of Arkus’s commitment to giving back to our communities, we host at least two Pro Bono Days each year, where we invite local nonprofits to join us in the office for learning and free consultations. This spring we focused on tools that admins can use to better support their Salesforce orgs.

Duplicate Management

We started the day with Peter White sharing some great information on managing duplicates in Salesforce. We learned the importance of understanding our data sources and what impact that can have on the likelihood of having duplicate records in a database.

We also reviewed resources for learning more about duplicates, including Trailhead and the native Salesforce Duplicate Management setup. We got a live demo of setting up matching rules and duplicate rules, and we talked about the host of third-party apps that can help an admin dedupe their existing data.

Duplicate management is an important part of an administrator’s role. Peter also shared some of his tips here on the blog.


Tips and Tricks for Declarative Lookup Rollup Summaries

Following duplicate management, we learned some use cases for the open source tool Declarative Lookup Rollup Summaries (DLRS) from Ashley Leifer. First we learned the difference between Master-Detail relationships and Lookup relationships. Where a Master-Detail relationship allows rollup summary fields natively, lookup relationships don’t. Enter DLRS!

DLRS allows an administrator to declaratively (using clicks, not code) define a trigger that searches for related child records in a lookup relationship and rolls up the information. We learned about some of the coolest features from DLRS, including the ability to roll up text information or formula fields.

Ashley then walked us through the tool with real-world use cases. We saw an example of rolling up Affiliation information to a Contact to see an individual’s previous workplaces. We also saw a custom Expense object rolled up to a Campaign.

Ashley Leifer describes the benefits of DLRS


Introduction to Workbench

We finished the morning with an introduction to Workbench from James Bujold. Like DLRS, Workbench is a Salesforce tool that helps an administrator do even better by their org and their users: a series of utilities that allow an administrator to dig into their org, both metadata and data.

He walked us through use cases for Workbench, including using the query builder to find recently deleted records and undelete them, or finding the source of a flow error by searching through the metadata for its id. Perhaps the highlight for the group was the Password Management utility, which allows an admin to set or reset a user’s password and temporarily assign specific text as the new one.


James Bujold shares some Workbench use cases

We love sharing knowledge and empowering admins, and our favorite part of these events is sitting down afterwards and talking about the real challenges that our participants face every day. Thank you again to those that joined us, and we look forward to seeing more of you at our next one.


Have more use cases for these tools? Just want to say hi? Follow us on Twitter, on the Salesforce Community, Facebook, or chat with me @thesafinhold.


Permission Sets in a Lightning World

If you know me, you know I love Permission Sets. They are my favorite non end user facing feature on the Salesforce platform. It’s a geek thing I guess, but give me some granular permissions to rollout to specific users, and I’m a happy camper.

How have they changed since they first rolled out?

The biggest change is actually not to Permission Sets themselves but to the platform that they reside on. Salesforce has changed so much since Permission Sets launched about five years ago. There was no concept of Lightning. There was no Einstein. There wasn’t even Salesforce1 Mobile. It was good old Salesforce Classic with “Apps” that were essentially a series of tabs that users could get to through other means anyway. Since Permission Sets were built API-first, they have easily kept pace as Salesforce has added features and functions to the platform. Virtually any feature can be turned on and off via a Permission Set, including really big changes for end users, like the ability to be a Lightning Experience user or not.


Are they any different now in Lightning Experience?

Simple answer: no. Permission Sets have largely stayed the same in terms of their utility. Their main use is to apply a permission, or set of permissions, to individual users to provide them more access to features on the platform. Of course there are some new features in the Summer ’17 release that make life a little easier, like standard Permission Sets. Just as Salesforce encouraged all ISVs to ship their products with Permission Sets, Salesforce is now following suit itself. An example is the standard Permission Set for Sales Console: if you purchase five Sales Console licenses, you can simply assign the predefined Permission Set to five users, and they automatically get the license and the ability to use the console. Nifty…

The biggest difference is really how an administrator can go about launching Lightning Experience to the organization. When Chatter rolled out, it was all or nothing. Salesforce learned from that “mistake” and, via Permission Sets, allows you to roll Lightning Experience out at a pace that is comfortable and manageable.


A Sign of Maturity Early On

Did you need to change any of your legacy Permission Sets because you flipped to Lightning Experience? Another simple answer: no, you didn’t. This is a testament again to the API-first approach of building out Permission Sets. There are a lot of features on the Salesforce platform that are being “Lightningized,” but Permission Sets are not one of them; they just work the way they have always worked. Nothing new to learn, just create and assign ad-hoc permissions as your heart desires (using The Permissioner of course).

What Next (IMHO)?

Moving forward, I would love to see some improvements to Permission Sets to keep up with some of the Lightning features, such as component-level permissions as opposed to page-level permissions. It has always been a bit of a gripe that you cannot control page layouts via a Permission Set (which I understand from a technical perspective, but hey, I’m just a user here, and I want my page- and component-level permissions in a Permission Set). Imagine a Lightning App Page with six custom components on it. Now imagine being able to control which users see which components on the page based on a Permission Set: that is a pretty custom, tailored experience. This could also have major implications for Community rollouts, as Community Templates get more and more popular.

All in all, Permission Sets remain tried and true. Through all the turbulence of migration to Lightning Experience, which is still happening and will continue to happen for at least a few years to come, Permission Sets remain one of the more reliable tools in the shed.

Please feel free to comment below, on the Salesforce Success Community, on our Facebook page, or directly at me on Twitter @JustEdelstein.