Set Up a Databricks Data Federation Connection
Set up a connection between AWS- or Azure-hosted Databricks and Data Cloud to access data.
| User Permissions Needed | |
|---|---|
| To create a connection in Data Cloud: | System Admin |
Before you begin:
- Review the Data Cloud IP Allowlist, and update your Databricks IP access lists to include those ranges. (For a scripted approach, see the sketch after this list.)
- Make sure these workspace and compute requirements in Databricks are met:
  - Workspace requirements for querying data on Databricks from Salesforce: none, although Unity Catalog is recommended.
  - Compute requirements:
    - Network connectivity from your Databricks Runtime cluster or SQL warehouse to Salesforce. For more information, see Networking recommendations for Lakehouse Federation.
    - Databricks clusters must use Databricks Runtime 13.1 or later and run in shared or single-user access mode.
    - SQL warehouses must be Pro or Serverless.
- To set up connectivity for data federation, the account admin must have the Workspace admin role.
- The Workspace admin must complete these setup tasks.
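One possible way to script the allowlist prerequisite above is the Databricks IP Access Lists REST API. This is a minimal sketch, not an official part of this setup: the workspace URL, token, and IP range below are placeholders, and you must substitute the actual Data Cloud ranges from the IP Allowlist documentation. Note that the IP access lists feature must already be enabled on the workspace.

```python
import requests

# Placeholders: replace with your workspace URL, a personal access token
# with workspace admin rights, and the real Data Cloud IP ranges from the
# Data Cloud IP Allowlist documentation.
WORKSPACE_URL = "https://adb-1234567890123456.7.azuredatabricks.net"
TOKEN = "dapi-..."
DATA_CLOUD_RANGES = ["203.0.113.0/24"]  # hypothetical example range

# Create an ALLOW list containing the Data Cloud ranges.
resp = requests.post(
    f"{WORKSPACE_URL}/api/2.0/ip-access-lists",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json={
        "label": "salesforce-data-cloud",
        "list_type": "ALLOW",
        "ip_addresses": DATA_CLOUD_RANGES,
    },
)
resp.raise_for_status()
print(resp.json())
```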
1. In Data Cloud, go to Data Cloud Setup.
2. Under Configuration, select Connectors.
3. Click New.
4. Under Source, select Databricks and click Next.
5. Enter a connection name and connection API name.
6. Enter the authentication and connection details.

   | Field | Description |
   |---|---|
   | Authentication Method | There are two ways to authenticate: provide a Username and Password, or a Client ID and Client Secret. |
   | Username | Contact your Databricks admin to obtain the username and password for Databricks. |
   | Password | |
   | Client ID | Contact your Databricks admin to obtain the Azure client ID and client secret for Databricks. |
   | Client Secret | |
   | Connection URL | The server hostname of the SQL warehouse along with the port number. For example: adb-8903155206260665.5.azuredatabricks.net:443 |
   | Http Path | The HTTP path value of the SQL warehouse in Databricks. |

7. To verify your configuration, click Test Connection. (To check the warehouse from outside Data Cloud, see the sketch after these steps.)
8. Click Save.
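If Test Connection fails and you want to rule out the SQL warehouse itself, one way to check it independently of Data Cloud is the databricks-sql-connector Python package, using the same server hostname and HTTP path you entered above. This is a minimal sketch assuming token-based authentication (unlike the Data Cloud connector, which uses username/password or client credentials); the hostname, HTTP path, and token are placeholders.

```python
from databricks import sql  # pip install databricks-sql-connector

# Placeholders: use the same server hostname and HTTP path you entered in
# the Data Cloud connection, plus a valid Databricks personal access token.
with sql.connect(
    server_hostname="adb-8903155206260665.5.azuredatabricks.net",
    http_path="/sql/1.0/warehouses/abc123def456",  # hypothetical path
    access_token="dapi-...",
) as conn:
    with conn.cursor() as cursor:
        # A trivial query confirms the warehouse is reachable and running.
        cursor.execute("SELECT 1")
        print(cursor.fetchall())
```

If this query succeeds but Test Connection still fails, the problem is more likely on the networking or credentials side (for example, the IP allowlist or the authentication values entered in step 6) than with the warehouse.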