Data sharing allows developers to share data from Salesforce Data Cloud’s data lake with systems like Snowflake via zero-copy data sharing. This approach greatly reduces the time needed for developers to work with Data Cloud data in an external data lake or system.

In this blog post, we’ll discuss how to use data sharing to share data from Data Cloud to Snowflake.

What is Salesforce Data Cloud?

Salesforce Data Cloud revolutionizes how businesses harness their disconnected and trapped data, both structured and unstructured, to provide a comprehensive, actionable 360-degree view of their customers. Serving as a foundational element of the Einstein 1 Platform, Data Cloud facilitates connected experiences for businesses and consumers, leveraging insights and AI powered by metadata and unified data harmonization.

Data Cloud has built-in connectors that bring in data, in batch or streaming mode, from many sources, including Salesforce apps, mobile, web, connected devices, and legacy systems. With an open and extensible architecture, Data Cloud brings together a unified view of data to help your Sales, Service, and Marketing teams build personalized customer experiences, trigger data-driven actions and workflows, and safely drive AI across all of your Salesforce apps.

What is Snowflake?

Snowflake is an advanced data platform that enables data storage, processing, and analytics solutions. It runs completely on cloud infrastructure, with the exception of its optional command-line clients, drivers, and connectors. Snowflake sits on public clouds like Amazon Web Services (AWS), Google Cloud Platform (GCP), and Azure, and it offers native support for many data types, including structured and semi-structured data.

Developers can move data to Snowflake using any ETL tool supported by the platform. Data analysts and data scientists can also report on data from Snowflake using the many reporting tools that connect to it, such as Tableau.

Let’s now take a look at how to connect the two platforms and share data from Salesforce Data Cloud to Snowflake.

Connecting and sharing Data Cloud data to Snowflake

Prerequisites

To get started, you’ll need Account Admin, Sys Admin, or Security Admin permissions to perform these steps. Please note that, at the time of writing, this functionality is not yet available in the Data Cloud 5-day orgs on Trailhead. You can sign up for a 30-day Snowflake trial account.

Step 1: Create the integration user

Start by creating a worksheet in Snowflake: navigate to Projects > Worksheets, and click the + icon to create a new SQL worksheet.

Image of Snowflake Worksheets screen with a list of worksheets

Note: Account Admin, Org Admin, and Security Admin are on the blocked roles list by default for new security integrations. If you want to use one of these roles as your integration user, you will need to reach out to Snowflake support. Otherwise, you will need to use a different role. You can also create a custom role. For the purposes of this demo, we are using the Public role.

On the SQL code worksheet, replace < Data Cloud Admin or Data Aware Specialist > with a name for the user. You’ll then need to enter information for each field to create the user.

Here is an example of what the SQL should look like to create a user:

Image of a worksheet with SQL code for creating a user in Snowflake
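The exact SQL from the screenshot isn’t reproduced here, but a sketch of what the statement might look like is below. The username, password, email, and display name are all illustrative placeholders, not values from the original post:

```sql
-- Illustrative values only; replace each property with your own before running
CREATE USER DATA_CLOUD_ADMIN
  PASSWORD = '<a strong password>'
  LOGIN_NAME = 'DATA_CLOUD_ADMIN'
  DISPLAY_NAME = 'Data Cloud Integration User'
  EMAIL = 'user@example.com'
  DEFAULT_ROLE = PUBLIC
  MUST_CHANGE_PASSWORD = FALSE;
```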

Then, press the play button in the top right-hand corner of the worksheet to run the worksheet.

Image of play button highlighted by a red rectangular border

Step 2: Set up OAuth and grant access to the PUBLIC role

Next, create a second SQL worksheet and paste the code block below into your worksheet.
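The original code block appears only as screenshots, so here is a hedged sketch of what the worksheet might contain. It is laid out so that the security integration statement spans lines 1-9 and the follow-up commands fall on lines 11, 13, and 15, matching the line numbers referenced in this step. The integration name is a placeholder, and the redirect URI must be the callback URL that Salesforce Data Cloud provides:

```sql
-- Create a custom OAuth security integration for Data Cloud (name is illustrative)
CREATE SECURITY INTEGRATION SALESFORCE_DATA_CLOUD
  TYPE = OAUTH
  OAUTH_CLIENT = CUSTOM
  OAUTH_CLIENT_TYPE = 'CONFIDENTIAL'
  OAUTH_REDIRECT_URI = '<callback URL provided by Data Cloud>'
  ENABLED = TRUE
  OAUTH_ISSUE_REFRESH_TOKENS = TRUE
  OAUTH_REFRESH_TOKEN_VALIDITY = 7776000;

SELECT SYSTEM$SHOW_OAUTH_CLIENT_SECRETS('SALESFORCE_DATA_CLOUD');

DESCRIBE SECURITY INTEGRATION SALESFORCE_DATA_CLOUD;

GRANT USAGE ON INTEGRATION SALESFORCE_DATA_CLOUD TO ROLE PUBLIC;
```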

Then, run lines 1-9 to create the security integration by pressing command + return on a Mac or ctrl + enter on a Windows machine.

Image of Snowflake Worksheets screen with SQL code for creating the security integration

Next, run line 11 to generate the client ID and secret by pressing command + return on a Mac or ctrl + enter on a Windows machine. Take note of the client ID and secret, as you will need them later.

Image of Snowflake Worksheets screen with worksheets with SQL code for creating the OAuth

Then, run line 13 to get the OAuth endpoint; you should see it in row 9 of the results table. Note down the endpoint, as it is needed later.

Image of Snowflake Worksheets screen with SQL code and the authorization endpoint

Finally, run line 15 to grant the role access to the security integration.

Image of Snowflake Worksheets with SQL code to grant the role access to the security integration

Step 3: Create the data share target

In Salesforce Data Cloud, navigate to Data Share Targets and click New. Then select the Snowflake tile.

Image of New Data Share Target screen with the Snowflake tile selected

Next, enter a label or name for your connection. For the account URL, use your OAuth endpoint and remove everything after “.com.” For example, if your URL is “https://www.myurl.com/oauth/token-request,” you’d remove “/oauth/token-request.”

Image of the New Data Share Target creation screen with fields for label, API name, and Account URL

Next, enter your client ID and secret.

Image of the New Data Share Target creation screen with fields for label, API name, and Account URL populated, and Client ID and Client Secret fields blank

You should then be redirected to a login page, where you’ll enter the username and password of the Snowflake user you created in Step 1.

Image of a sign-in screen with username and password populated

You are then redirected to another page, where you’ll need to click Allow to allow Salesforce Data Cloud to access your Snowflake account.

Image of a screen where Snowflake is asking to allow Data Cloud to access the account

You can verify that your data share target was connected successfully by checking that your status is showing as “Active,” and that your authentication status is showing “Successful.”

Image of a data share target with call-outs to the authentication status as “Successful” and data share target status as “Active”

Step 4: Share your Data Cloud data with Snowflake

Now, let’s share our Salesforce Data Cloud data with Snowflake.

First, navigate to Data Shares and click New. Enter a label and choose a data space to associate your data share with. In this case, you’ll use the default data space.

Image of a New Data Share screen with Label, Name, Data Space, and Description fields

Next, choose a data lake object or data model object to share. In this case, you’ll choose to share a data lake object that is ingesting data from Google Cloud Storage. Finally, click Save.

Image of a list of data lake objects

Next, choose your data share target by clicking the Link/Unlink Data Share Target button.

Image of a created data share

Then, choose your data share target and click Save.

Image of a list of data share targets that are Snowflake environments that can be linked

Step 5: Get the data in Snowflake

In Snowflake, navigate to Data Products, then Private Sharing. Under Direct Shares, you’ll see the data share from Salesforce Data Cloud. Click the down arrow button to get the data.

Image of direct shares under Private Sharing in Snowflake

You can rename the database and choose which roles can access the database.

Image of a screen stating the data to be shared with fields for database name and roles

Click Done or View Database to view the database. Please note that your data might be brought over under a different container.

Image of a Database Created dialog screen

You can then see your database, along with its schemas and views, and view its metadata, such as field names and types.

Image of the data lake object in Snowflake under Databases with the views and schema

You can also preview your data in the database.

Image of a list of data with columns for country, email, first name, and gender

Conclusion

You now know how to connect and share your data from Salesforce Data Cloud to Snowflake using the zero copy data sharing approach. This powerful knowledge allows you to bypass time-consuming steps and build a robust integration faster. We recommend taking the Create a Data Stream in Data Cloud project on Trailhead to get hands-on with Data Cloud. We also encourage you to take the Get to Know BYOL Data Sharing in Data Cloud module to learn more about data sharing and the “Bring Your Own Lake” feature in Data Cloud.

Acknowledgments

Special thanks to Deepak Mooga for his help with the Snowflake commands.

About the Author

Danielle Larregui is a Senior Developer Advocate at Salesforce focusing on the Data Cloud platform. She enjoys learning about cloud technologies, speaking at and attending tech conferences, and engaging with technical communities. You can follow her on X.
