
Zero copy data federation, a Salesforce Data Cloud capability, unifies Salesforce and Snowflake data through a point-and-click interface. It provides secure access to Snowflake data without the need to extract, transform, and load it, eliminating that overhead and ensuring you always query current records from Snowflake tables. In this post, we’ll walk through how to use zero copy data federation between Snowflake and Salesforce Data Cloud.

In a previous post, we focused on sharing data from Data Cloud to Snowflake, which is one direction of the integration. Zero copy data federation also works in the other direction, letting you share data stored in Snowflake with Data Cloud. This tutorial walks you through the end-to-end steps. Note that these steps may not cover every best practice; follow the guidance outlined by both Snowflake and Salesforce for creating secure integrations. This post is a general guide and example.

Connect and share Snowflake data to Salesforce Data Cloud

Snowflake is a popular, highly scalable data warehouse used by a wide variety of customers to store large volumes of data. A customer may want to leverage data stored in Snowflake with some of Data Cloud’s activations, such as Marketing Cloud Journeys for text messaging and emails. They may also want to surface their Snowflake data on Salesforce objects and related lists using Data Cloud enrichments. Bringing Snowflake data into Salesforce Data Cloud unlocks new opportunities for that data.

Let’s take a look at how to use zero copy data federation to share data stored in Snowflake to Data Cloud.

Step 1: Create a warehouse

In Snowflake, navigate to Admin, then Warehouses, and click the + Warehouse button in the top right-hand corner.

Snowflake Admin page with the default COMPUTE_WH and a newly created DATA_CLOUD_WAREHOUSE

Next, enter a name for your warehouse, and choose the type and size of warehouse that fits your use case.

New warehouse configuration modal with the type set to Standard and size set to X-Small

After you create the warehouse, you’ll need to transfer its ownership to a non-administrative role so that Data Cloud can access it. Click the ellipsis (three-dot) menu next to the warehouse that you want Data Cloud to access, and select Transfer Ownership.

The Warehouses page shows the Menu button on the DATA_CLOUD_WAREHOUSE row clicked. The menu has options to Edit, Suspend, Drop, or Transfer Ownership of the Warehouse.

In the Transfer Ownership dialog box, select the role to which you’d like to transfer ownership. In this example, we’ll transfer ownership to the PUBLIC role.

Transfer Ownership modal for the DATA_CLOUD_WAREHOUSE, where the “Transfer to“ drop-down menu is selected and lists default roles

Step 2: Generate a public-private key pair

Next, you’ll need to create a public and private key pair, which will be used to make the connection between Salesforce Data Cloud and Snowflake.

Note: Salesforce Data Cloud currently does not support encrypted private keys.

Generate a public-private key pair with JWT.io

You can create a public-private key pair from the terminal or with JWT.io, which makes the process very simple. Change the algorithm to RS256; you can keep the payload as it is. And voila! A public and private key pair is generated.

The JWT.io web app screen shows an encoded and decoded JSON Web Token. The public-private key pair is used to sign and verify the signature.

Generate a public-private key pair with the command line

You can also generate a public-private key pair using the command line.

First, open your terminal on a Mac and enter the following command to generate an unencrypted private key in PKCS#8 format using the RSA algorithm. Follow the equivalent commands for a PC, or use the JWT.io app mentioned above.

openssl genrsa 2048 | openssl pkcs8 -topk8 -inform PEM -out rsa_key.p8 -nocrypt

You can then open your private key in a code editor like VS Code:

code rsa_key.p8

An RSA private key that has been hidden via pixelation

Next, to get the public key, use the following command in your terminal.

openssl rsa -in rsa_key.p8 -pubout -out rsa_key.pub

To open the public key, run the following command in your terminal and choose the code editor that you would like to use to open the public key.

code rsa_key.pub

An RSA public key that has been hidden via pixelation
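When you assign the public key to the Snowflake user in the next step, Snowflake expects only the key body, without the BEGIN/END lines. Here’s a small sketch for producing that on macOS/Linux; the file names follow the commands above, and pub_key_body.txt is just an illustrative output name:

```shell
# Generate an unencrypted PKCS#8 private key and derive its public key
# (same commands as above, repeated so this sketch is self-contained)
openssl genrsa 2048 2>/dev/null | openssl pkcs8 -topk8 -inform PEM -nocrypt -out rsa_key.p8
openssl rsa -in rsa_key.p8 -pubout -out rsa_key.pub 2>/dev/null

# Strip the BEGIN/END lines and join into a single line: just the key body
grep -v "PUBLIC KEY" rsa_key.pub | tr -d '\n' > pub_key_body.txt
```

The contents of pub_key_body.txt can then be pasted as the RSA_PUBLIC_KEY value in Step 3.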

Step 3: Create an integration user in Snowflake

Now that we have our public and private key pair, we need to create an integration user in Snowflake, and assign them the public key.

In Snowflake, navigate to Projects then Worksheets. Select the + Icon in the top right-hand corner and choose SQL Worksheet.

Image of Worksheets with the plus sign to create a new SQL worksheet

Enter the information for the integration user. This user will need administrative privileges, so assign them the System Administrator (SYSADMIN) role.

Your SQL should look something like this:

A SQL query to create a user for the Snowflake - Data Cloud integration and set its RSA_PUBLIC_KEY
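As a sketch, the worksheet SQL might look something like the following. The user name, password, and key value are placeholders; substitute your own, and paste the public key body without its BEGIN/END lines:

```sql
-- Create the integration user (DATA_CLOUD_USER is a placeholder name)
USE ROLE ACCOUNTADMIN;

CREATE USER DATA_CLOUD_USER
  PASSWORD = '<strong-password>'
  DEFAULT_ROLE = SYSADMIN
  DEFAULT_WAREHOUSE = DATA_CLOUD_WAREHOUSE;

GRANT ROLE SYSADMIN TO USER DATA_CLOUD_USER;

-- Assign the public key generated in Step 2 (value truncated here)
ALTER USER DATA_CLOUD_USER SET RSA_PUBLIC_KEY = 'MIIBIjANBgkqh...';
```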

You can use Command+Shift+Return on a Mac or Ctrl+Shift+Enter on a PC to run the entire block of code, or press the play button in the top right-hand corner. To run a single line or block, put your cursor on the line that you want to run, then press Command+Return (Ctrl+Enter on a PC).

The DESCRIBE USER statement then lets you verify that the public key was successfully assigned to the user. In the screenshot below, the RSA_PUBLIC_KEY property appears in row 24.

The result of a SQL query to retrieve the Snowflake - Data Cloud integration user that lists the RSA_PUBLIC_KEY value
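The verification statement itself is a one-liner; DATA_CLOUD_USER is a placeholder for your integration user’s name:

```sql
-- The RSA_PUBLIC_KEY property should show the assigned key
DESCRIBE USER DATA_CLOUD_USER;
```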

Step 4: Connect Data Cloud to Snowflake

In Salesforce, navigate to Data Cloud Setup, then to Snowflake. Click New.

Snowflake integration page in Data Cloud Setup

Enter a Connection Name and Connection API Name (you can keep the default). Enter your Snowflake account URL. Then enter the username of your integration user and the private key.

Note: Do not include the header or footer of the private key. Only paste the key.

Connection Details page configured to use the Snowflake - Data Cloud user
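Per the note above, Data Cloud expects only the private key body. One way to extract it on macOS/Linux, assuming the rsa_key.p8 file from Step 2 (private_key_body.txt is an illustrative output name):

```shell
# Generate an unencrypted PKCS#8 private key, as in Step 2,
# so this sketch is self-contained
openssl genrsa 2048 2>/dev/null | openssl pkcs8 -topk8 -inform PEM -nocrypt -out rsa_key.p8

# Remove the BEGIN/END PRIVATE KEY lines and join into a single line for pasting
grep -v "PRIVATE KEY" rsa_key.p8 | tr -d '\n' > private_key_body.txt
```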

You can then choose a warehouse.

New Connection page listing the DATA_CLOUD_WAREHOUSE created in Snowflake

You’ll know that your connection was successful when you see your connection status as active.

Snowflake Setup page with an active connection to Snowflake

To start ingesting data from Snowflake into Data Cloud, open the Data Cloud application from the App Launcher, click the Data Streams tab, and then click New. You’ll see that a Snowflake tile is now available; click it.

New Data Stream page with the Snowflake tile selected

Choose your connection and your database. You’ll then be able to see the schema of your connected Snowflake account in Data Cloud.

New Data Stream page for the Snowflake integration listing the data warehouse’s available schemas

Conclusion

In this post, you learned how to create a warehouse in Snowflake, generate a public and private key pair, create an integration user in Snowflake, and configure Salesforce Data Cloud to connect to Snowflake. Now that you know how to connect Data Cloud to Snowflake, you can start sharing your data from Snowflake to Data Cloud. Learn more about Snowflake by signing up for a 30-day trial account and trying one of their numerous tutorials.

Resources

Trailhead: Get to Know BYOL Data Sharing in Data Cloud
Help Documentation: Share Data with Snowflake Using Zero Copy Integration

About the authors

Danielle Larregui is a Senior Developer Advocate covering the Data Cloud platform at Salesforce. She enjoys learning about cloud technologies, speaking at and attending tech conferences, and engaging with technical communities. You can follow her on X.

Charles Watkins is a Developer Advocate at Salesforce and a full-stack software developer focused on the core Salesforce Platform. You can find him on LinkedIn.

Get the latest Salesforce Developer blog posts and podcast episodes via Slack or RSS.
