As part of the Spring ’24 release, we are excited to launch Einstein 1 Studio Model Builder in Salesforce Data Cloud. Einstein 1 Studio allows admins, developers, and data professionals to build, connect, and manage their predictive and generative AI models. It provides a unified model management platform in Salesforce for multiple types of models, including no-code predictive models built in Model Builder, Bring Your Own predictive models (BYOM), and Bring Your Own large language models (BYO LLM). These models can then be seamlessly integrated into any Salesforce Customer 360 application for intelligent decision-making.
Bring Your Own LLM (BYO LLM) in Einstein 1 Studio is a new capability that allows your admins, data scientists, and AI specialists to connect externally hosted large language models to Salesforce for use within custom Prompt Builder templates. This gives development teams the choice of using an existing agreement with Microsoft, Google, AWS, or OpenAI to easily connect an LLM and use it for generative AI prompts within Salesforce. The model that you connect can be one of the standard versions or, importantly, if your provider supports fine-tuning, one of those customized models.
What can you do with BYO LLM?
Once you connect an external foundation model as a source for your generative AI, you can adapt it across a wide variety of generative AI use cases. With Einstein 1 Studio, you can easily configure new models and test prompts in a playground environment before deploying a model to production.
After you have connected and validated your external LLM, you can use Salesforce Prompt Builder to create a wide variety of generative applications for your Salesforce users. Prompt Builder also allows you to ground responses on your CRM data and integrate Flows, Apex, and much more.
Because Prompt Builder is the key consumer of BYO LLM — and it’s also where the critical business value is delivered to your users — we highly recommend that you check out this comprehensive Ultimate Guide to Prompt Builder. And while you’re at it, check out all the ways to integrate Prompt Builder with Invocable Actions, Apex, API, and more in the documentation.
Key terms to understand for BYO LLM
- Foundation model: The ‘base’ LLM that you are connecting to.
- Configured model: A virtual instance of the foundation model with a unique name that can have customized hyperparameters.
- Salesforce-enabled models: The standard LLMs that are enabled by Salesforce for use in turnkey generative AI features.
- Custom models: Your organization’s external LLMs that you have connected for use in custom Prompt Builder templates.
- Model Playground: The simple prompt experience inside Einstein 1 Studio, which allows you to create and test configured models. Optionally, you can test and save different hyperparameters for each configured model that can then be used in different prompt templates for specific business use cases.
- Chat completion/inference: The response from your LLM. Typically this is either generation or summarization.
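To make the relationship between these terms concrete, here is a minimal illustrative sketch in Python. The class and field names are our own shorthand, not a Salesforce API: the point is simply that a configured model is a named virtual instance of a foundation model, carrying its own hyperparameters.

```python
from dataclasses import dataclass, field


@dataclass
class FoundationModel:
    """The 'base' external LLM connection (e.g., an OpenAI GPT model)."""
    provider: str
    model: str


@dataclass
class ConfiguredModel:
    """A named virtual instance of a foundation model with its own hyperparameters."""
    name: str
    foundation: FoundationModel
    hyperparameters: dict = field(default_factory=dict)


# One foundation model connection...
gpt4 = FoundationModel(provider="OpenAI", model="gpt-4")

# ...can back several configured models, each tuned for a business use case.
service_agent = ConfiguredModel("Service Agent Configured Model", gpt4,
                                {"temperature": 0.2, "max_tokens": 512})
sales_email = ConfiguredModel("Sales Email Configured Model", gpt4,
                              {"temperature": 0.8, "max_tokens": 1024})
```

Both configured models above share the same foundation model connection; only their names and hyperparameters differ.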
BYO LLM architecture
The external models are connected, configured, and managed within Einstein 1 Studio in Data Cloud. The prompt templates that connect and use your LLM can be placed anywhere in the Salesforce UX that makes sense for your use case.
After you connect an external LLM, prompts that are executed from within the Salesforce environment will make a completion/generation request via API in the same manner as other generative AI applications. All inference requests from your external model are routed through the LLM Gateway and Einstein Trust Layer before surfacing content to your users.
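The exact request format is provider-specific, but a chat-completion request in the widely used OpenAI-style shape looks roughly like the sketch below. The model name, messages, and hyperparameter values are placeholder assumptions for illustration, not values taken from Salesforce.

```python
import json

# Illustrative chat-completion request body in the common OpenAI-style shape.
# A gateway forwards a request like this to your external model's endpoint;
# the model name and message contents below are placeholders.
payload = {
    "model": "gpt-4",  # the configured model's underlying foundation model
    "messages": [
        {"role": "system", "content": "You are a helpful sales assistant."},
        {"role": "user", "content": "Draft a short follow-up email for ACME Corp."},
    ],
    "temperature": 0.7,   # hyperparameters saved with the configured model
    "max_tokens": 512,
}

body = json.dumps(payload)  # serialized request body sent over HTTPS
```

The response (the chat completion, or inference) comes back in the provider's corresponding response format before being processed by the Trust Layer.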
As illustrated in the architecture slide below, the Einstein Trust Layer provides an extra layer of protection and compliance. See our blog post for a deep dive into the Einstein Trust Layer.
The large language models supported as of the Summer ’24 release are listed below.
OpenAI
- OpenAI GPT-4
- OpenAI GPT-4 32k
- OpenAI GPT-3.5 Turbo
- OpenAI GPT-3.5 Turbo 16k
Azure OpenAI
- Azure OpenAI GPT-4 Turbo
- Azure OpenAI GPT-3.5 Turbo
- Azure OpenAI GPT-3.5 Turbo 16k
Amazon Bedrock
- Anthropic Claude 3 Haiku
- Anthropic Claude 3 Sonnet
- Anthropic Claude 3 Opus
Google Vertex AI
- Gemini 1.5 Pro
Requirements for using BYO LLM in a Salesforce org
To enable BYO LLM in your Salesforce org, you will need two key prerequisites:
- Data Cloud must be enabled in your org to access Einstein 1 Studio Model Builder.
- The Einstein for Sales, Service, or Platform add-on SKU is required for the BYO LLM Foundation Model tab to be enabled.
How to configure BYO LLM in your org and use it with Prompt Builder
To get BYO LLM up and running quickly in your org, the full end-to-end setup process takes 10 easy steps. For the walkthrough, we will use screenshots of an actual BYO LLM deployment, using an OpenAI model that has been fine-tuned with fictitious data for a manufacturing company that builds “Flux Capacitors.”
Prerequisites
To connect your large language model to Einstein 1 Studio, you will need to gather a few key pieces of information. The configuration parameters you need to provide vary slightly depending on your provider.
For OpenAI
- Your organization’s secret key
- The version of GPT that you wish to connect
- The model name for fine-tuned models only (see screenshot below)
For Azure OpenAI
- Your organization’s secret key
- The version of GPT that you wish to connect
- The endpoint URL for your model
- The Azure deployment name from Azure OpenAI Studio
For Google Gemini on Vertex
- Your organization’s Private Key and ID
- Your organization’s Service Account email
- The endpoint URL for your model
For Anthropic Claude on Bedrock
- Your organization’s Access Key
- Your organization’s Secret Key
- The region for your instance of Bedrock (e.g. us-west-2)
- The version of Anthropic Claude that you wish to connect
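The per-provider parameters above can be summarized as a simple checklist. The sketch below is illustrative only: the field names are our shorthand, not the exact labels you will see in the Einstein 1 Studio UI.

```python
# Required connection parameters per provider, per the lists above.
REQUIRED_FIELDS = {
    "OpenAI": ["secret_key", "gpt_version"],  # plus the model name for fine-tuned models
    "Azure OpenAI": ["secret_key", "gpt_version", "endpoint_url", "deployment_name"],
    "Google Vertex AI": ["private_key", "private_key_id",
                         "service_account_email", "endpoint_url"],
    "Amazon Bedrock": ["access_key", "secret_key", "region", "claude_version"],
}


def missing_fields(provider: str, config: dict) -> list:
    """Return any required parameters not yet gathered for this provider."""
    return [f for f in REQUIRED_FIELDS[provider] if not config.get(f)]


# Example: a partially filled Amazon Bedrock configuration.
bedrock_config = {"access_key": "AKIA...", "secret_key": "...", "region": "us-west-2"}
```

Running `missing_fields("Amazon Bedrock", bedrock_config)` on the example above would flag that the Claude version has not been chosen yet.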
Step 1: Add a foundation model from within Data Cloud
Navigate to the Einstein 1 Studio tab in Data Cloud and select the Foundation Models tab. Adding a foundation model establishes an endpoint connection between the external source and Salesforce. Click the Add Foundation Model button.
Step 2: Choose between OpenAI and Azure OpenAI
Select the LLM provider that you wish to connect.
Step 3: Enter the required fields in the model connection details
Enter any name you wish for your foundation model and the required inputs. The screenshot below shows the connection details for an OpenAI fine-tuned model.
Click Connect, and then save your foundation model with the Name and Connect button. You can reuse the connection name here if you wish.
Step 4: Review the details of your foundation model on the model overview page
After you’ve saved the model, confirm that the details look correct on the model overview screen. Now that you have connected a foundation model, you can configure and test a new “configured model” to be used in prompts in Model Playground. Click the Configure a New Model button.
Step 5: In the model playground, test the connection to your model with a prompt
Enter a simple text prompt and verify that you are receiving an appropriate response from the LLM. Optionally, adjust hyperparameters and validate the response.
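Conceptually, a playground session is the same test prompt sent with different hyperparameter settings so that you can compare responses before saving. A rough sketch of that loop, where `send_test_prompt` is a hypothetical stand-in for the playground's send action (not a real Salesforce API):

```python
# Hypothetical stand-in for the playground's "send prompt" action. In the real
# UI this calls your connected LLM; here it just records the settings used.
def send_test_prompt(prompt: str, temperature: float, max_tokens: int) -> dict:
    return {"prompt": prompt, "temperature": temperature, "max_tokens": max_tokens}


prompt = "Summarize the key features of the Flux Capacitor FC-2000."

# Try the same prompt at several temperatures, then compare the responses
# before saving your preferred settings as a configured model.
runs = [send_test_prompt(prompt, t, 512) for t in (0.0, 0.5, 1.0)]
```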
Step 6: Save your configured model with an appropriate name that will be displayed in Prompt Builder
Save the configured model with a unique name that helps prompt engineers identify where they should use it (e.g., Service Agent Configured Model). After naming, click the Create Model button.
Step 7: Navigate and view your newly configured model
Close out of Model Playground and notice that you now have items in the Configured Models tab within your foundation model details. Click on the Configured Models tab.
Step 8: Enter the configured model details page by clicking on the model name
Click directly on the name of your newly configured model to view the details. Within this screen, you will notice a Create Prompt Template button. Select it, and a new tab opens and automatically launches the prompt template builder screen.
Step 9: Create a new prompt template
Select the type of prompt template you wish to use, along with the other required fields. Notice that we’ve selected the “Sales Email” template type, but you can use whichever template type makes sense for your use case.
Note: The Prompt Builder links earlier in this article go into detail about the templates.
Step 10: Select your configured model with the Custom Model Type dropdown
Within the Prompt Template workspace, change the model type drop-down menu from “Standard” to “Custom.” In the Models list, you should see the configured model that you created in Step 6. You can now create, preview, save, and activate your new prompt template for use anywhere in the Salesforce UX — with your LLM!
Notice in the screenshot below that the response we received when previewing our prompt template generated a simple sales email; however, it also incorporates content and nuanced knowledge from our fine-tuned model that is proprietary to our fictional organization.
Conclusion
Now that you know the process, we encourage you to try connecting your LLM and building some amazing GenAI features with Salesforce Prompt Builder. Your business users in CRM will be quite pleased when you make these highly intelligent prompts available to them for use in their daily tasks!
And there you have it: generative AI with BYO LLM, integrated and grounded on Salesforce data, and surfaced with Prompt Builder. Cheers!
Additional resources
- Learn more about Einstein Copilot
- Learn more about Prompt Builder
- BYO LLM Spring ’24 Release Notes
About the author
Darvish Shadravan is a Director of Product Management at Salesforce, where he has been employed for more than a decade. For most of that tenure, he has been focused on building AI products and helping customers with their machine learning efforts. Darvish’s current interest is enabling AI/ML professionals with maximum flexibility for generative AI applications and advanced data science capabilities in Salesforce. Before Salesforce, Darvish spent more than a decade at Microsoft in a variety of technical sales roles, and he has a Master’s degree in Data Science from the University of Wisconsin. You can connect with him via LinkedIn.