This section explains the basic developer workflow for developing a new function, and details how to add build-time and run-time configuration for your Functions.
To create a function, start by creating a Salesforce DX project.
The DX project name you choose represents the Salesforce Functions project name, which you use when invoking a deployed function.
Next, use the sf generate function command to create a function in your project. To create a function named myfunction, use the following command from your DX project root directory:
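For example, a sketch of the generate command (the flag names here are assumptions; check sf generate function --help for the exact syntax):

```
sf generate function --name myfunction --language typescript
```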
The function name must be only lowercase letters or numbers, and must start with a lowercase letter.
The only supported languages for the --language flag are JavaScript, TypeScript, and Java.
Creating a function adds the functions/<function name> directory to your DX project, containing starter function code.
Create a Function Using VS Code
To create a Function using VS Code with Salesforce Extensions for VS Code installed, first create a DX project using the SFDX: Create a New Project command palette command. Use the "Standard" configuration for Function development.
For a step-by-step guide to creating a Functions project, see Quick Start.
Each function must declare the dependencies it needs to build and run. Function dependency information is provided in language-specific configuration files.
Set Node.js Dependencies
Dependencies are defined in your function's package.json file in the <project root>/functions/<function name> directory. For more information on the package.json file, see https://nodejs.dev/learn/the-package-json-guide.
Running npm install with the Node.js package.json is recommended when you start developing a function; it installs dependencies that enable features such as type support for TypeScript function code in VS Code. You can also use npm build to verify you've added the right set of dependencies.
Generated TypeScript functions include the Salesforce Functions SDK for Node.js in the package.json dependencies by default.
Set Java Dependencies
Dependencies are defined in your function's pom.xml file for use with the Maven build tool. The following pom.xml file includes the Salesforce SDK for Java Functions in the list of dependencies:
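A sketch of what that dependency entry may look like; the Maven coordinates (groupId, artifactId, version) shown here are assumptions and should be confirmed against the Java Functions SDK documentation:

```xml
<dependencies>
  <!-- Salesforce SDK for Java Functions; coordinates are illustrative -
       confirm the groupId, artifactId, and version in the SDK docs. -->
  <dependency>
    <groupId>com.salesforce.functions</groupId>
    <artifactId>sf-fx-sdk-java</artifactId>
    <version>1.0.0</version>
  </dependency>
</dependencies>
```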
For more information on Maven and pom.xml files, see https://maven.apache.org/.
Each function must provide a project.toml TOML file that contains function metadata. This file usually resides in your <project root>/functions/<function name> directory. The generate:function command creates a template project.toml file that looks something like this:
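As a rough sketch of that template (the field names beyond salesforce-api-version are assumptions; the output of generate:function is authoritative):

```toml
# Illustrative template; regenerate with generate:function for the
# authoritative version.
schema-version = "0.1"
id = "myfunction"
description = "A Salesforce Function"

[com.salesforce]
schema-version = "0.1"
id = "myfunction"
description = "A Salesforce Function"
type = "function"
salesforce-api-version = "53.0"
```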
salesforce-api-version reflects the current version of the Salesforce REST API that the Functions SDK uses. Salesforce Functions supports API version 53.0 or later.
Update this TOML file with any function metadata you need. For a list of the valid fields for project.toml, see Function Metadata TOML Files. For details on the general TOML file format, see toml.io.
In your function code, import the Salesforce Functions SDK for your programming language.
When you generate a function with sf generate function, these dependencies are included in the example code.
Use Node.js SDK Import
In TypeScript, include the Salesforce Functions SDK for Node.js with an import statement at the top of your function file.
Use Java SDK Import
In Java, import the classes that you need from the com.salesforce.functions.jvm.sdk package, such as the SalesforceFunction interface.
Functions that access Salesforce data must have a specific code entry point that can be invoked with the invoking org’s context data and payload.
Specify TypeScript Entry Point
The following TypeScript example provides an execute entry point function:
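A minimal, self-contained sketch of such an entry point. The InvocationEvent, Context, and Logger interfaces below are local stand-ins so the sketch runs on its own; in a real project these types come from the Salesforce Functions SDK for Node.js, and their exact shapes here are assumptions:

```typescript
// Minimal stand-ins for the SDK types (in a real project, import these
// from the Salesforce Functions SDK for Node.js instead).
interface InvocationEvent<T> {
  data: T;
}
interface Logger {
  info(msg: string): void;
}
interface Context {
  // In a real function, context also exposes the invoking org's data API.
  id: string;
}

interface FunctionInput {
  name: string;
}
interface FunctionOutput {
  greeting: string;
}

// The exported execute function is the entry point the runtime invokes
// with the invoking org's context data and the request payload.
export default async function execute(
  event: InvocationEvent<FunctionInput>,
  context: Context,
  logger: Logger
): Promise<FunctionOutput> {
  logger.info(`Invocation ${context.id} received ${JSON.stringify(event.data)}`);
  return { greeting: `Hello, ${event.data.name}!` };
}
```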
Specify Java Entry Point
Java Functions must provide a public class that implements the SalesforceFunction interface and overrides the public apply() method. The following example provides an implementation of SalesforceFunction, with the FunctionInput and FunctionOutput classes defined elsewhere in the function project code:
The input and output types supported by SalesforceFunction are specific to the Java SDK and are described in Java Functions.
Functions require that source code is tracked with git. When developing functions, add your project to a GitHub repo and push your function code changes regularly to collaborate with other developers. Functions code, unlike Apex code, doesn't get deployed to your org, so you can't use your orgs as a way to share code.
Function code must be committed to git before you can deploy a function. However, pushing your function code to a github.com remote repo isn't required, although it's generally a good practice.
To add your project to GitHub, navigate to github.com in your browser. Log in to your github.com account, and create a repository. Save the git URL for your new repo. See Create a Repo for more details.
In the DX project root directory, use the following git commands:
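A typical sequence looks like the following sketch; substitute the git URL you saved, and note that your default branch may be main or master depending on your setup:

```
git init
git add .
git commit -m "Add function code"
git remote add origin <your-repo-git-url>
git push -u origin main
```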
Connect to additional Salesforce resources using the Salesforce Functions SDK for your language (Node.js or Java). The Salesforce Functions SDKs provide an integrated programming model for writing business logic that connects with your data in the Salesforce Customer 360 platform.
Use Context and DataApi
The SDKs provide context data for the calling org when your function is invoked. The data is passed as a parameter to your Function entry point. Through this context you can query and execute DML on your org data. Record access is controlled using the Functions permission set in your org.
From the context data, you can access the DataApi interface to query, insert, and update records. The following TypeScript example uses context.org.dataApi to make a simple query to the org that invoked the function:
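A runnable sketch of the call shape, using a stubbed DataApi in place of the SDK's (the query method and result shape here are assumptions modeled on the SDK's documented interface; in a real function the runtime supplies the context):

```typescript
// Minimal stand-ins for the SDK's Context and DataApi types, so this
// sketch runs on its own.
interface QueriedRecord {
  fields: Record<string, unknown>;
}
interface RecordQueryResult {
  totalSize: number;
  records: QueriedRecord[];
}
interface DataApi {
  query(soql: string): Promise<RecordQueryResult>;
}
interface Context {
  org: { dataApi: DataApi };
}

// Run a simple SOQL query against the invoking org and collect the names.
async function listAccountNames(context: Context): Promise<string[]> {
  const result = await context.org.dataApi.query(
    "SELECT Id, Name FROM Account LIMIT 5"
  );
  return result.records.map((r) => String(r.fields["Name"]));
}

// Stubbed context for demonstration only.
const stubContext: Context = {
  org: {
    dataApi: {
      async query(_soql: string) {
        return {
          totalSize: 1,
          records: [{ fields: { Id: "001xx000003DGb1AAG", Name: "Acme" } }],
        };
      },
    },
  },
};

const names = await listAccountNames(stubContext);
console.log(names); // the stub yields the single account name "Acme"
```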
The following Java example uses the DataApi class from the context data to do a query:
Discover the UnitOfWork Class
For more complex access, such as complex or large transactions, the Salesforce Functions SDKs provide the UnitOfWork class. A UnitOfWork represents a set of one or more Salesforce operations that must be done as a single atomic operation. Single atomic operations reduce the number of requests back to the org and are more efficient when working with larger data volumes. UnitOfWork also lets you manage data operations in your own transactions. The following TypeScript example uses UnitOfWork to create an Account record and related records:
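To illustrate the pattern, here is a runnable sketch with a locally stubbed UnitOfWork. The registerCreate and commitUnitOfWork names mirror the Node.js SDK but should be treated as assumptions; in a real function you obtain the unit of work from context.org.dataApi.newUnitOfWork():

```typescript
// Runnable sketch of the unit-of-work pattern with a local stub class.
type ReferenceId = string;

interface RecordForCreate {
  type: string;
  fields: Record<string, unknown>;
}

class UnitOfWork {
  readonly operations: { ref: ReferenceId; record: RecordForCreate }[] = [];
  private counter = 0;

  // Register a record to create; the returned reference can be used by
  // later records in the same unit of work to link to this record.
  registerCreate(record: RecordForCreate): ReferenceId {
    const ref = `ref${this.counter}`;
    this.counter += 1;
    this.operations.push({ ref, record });
    return ref;
  }
}

// Build one atomic set of operations: an Account plus a related Contact.
const uow = new UnitOfWork();
const accountRef = uow.registerCreate({
  type: "Account",
  fields: { Name: "Acme" },
});
uow.registerCreate({
  type: "Contact",
  fields: { LastName: "Smith", AccountId: accountRef },
});
// In a real function, everything is then submitted in a single request:
// await context.org.dataApi.commitUnitOfWork(uow);
console.log(uow.operations.length); // 2 registered operations
```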
Note for the Node.js SDK: always use a new instance of UnitOfWork for each transaction, and never reuse a committed UnitOfWork.
The following Java example (from the Context_UnitOfWork_Java sample) uses the UnitOfWork class from the Salesforce SDK for Java Functions (with FunctionInput and FunctionOutput defined elsewhere in the function code):
UnitOfWork uses the Composite Graph API for efficient transaction requests with higher record limits. For more details on the Composite Graph API, see REST API Developer Guide: Composite Graphs. Note that Composite Graph API limits (https://developer.salesforce.com/docs/atlas.en-us.api_rest.meta/api_rest/resources_composite_graph_limits.htm), such as a maximum of 15 different nodes or objects, or both, in one payload, also apply to UnitOfWork.
Use Salesforce APIs
If the provided SDK classes don't give you the data access you need, you can try making REST API calls directly to the calling org.
In the Salesforce SDK for Node.js Functions, you can use context.org.dataApi.accessToken to obtain the API access token for the invoking org. You can use this token with your preferred HTTP request framework to make REST API calls back to the invoking org, or use it to initialize a JSForce connection to access these APIs.
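As a sketch of using the token directly, the helper below builds the URL and Authorization header for a REST query request. The helper itself, the example values, and the API version segment in the path are illustrative, not part of the SDK:

```typescript
// Build the pieces of a Salesforce REST API query request from an
// instance URL and an access token (as obtained from
// context.org.dataApi.accessToken inside a function).
interface RestRequest {
  url: string;
  headers: Record<string, string>;
}

function buildQueryRequest(
  instanceUrl: string,
  accessToken: string,
  soql: string
): RestRequest {
  return {
    // v56.0 is an arbitrary example API version.
    url: `${instanceUrl}/services/data/v56.0/query?q=${encodeURIComponent(soql)}`,
    headers: { Authorization: `Bearer ${accessToken}` },
  };
}

// You could pass these pieces to fetch() or any HTTP client.
const req = buildQueryRequest(
  "https://example.my.salesforce.com",
  "00Dxx!example_token",
  "SELECT Id FROM Account"
);
console.log(req.url);
```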
In the Salesforce SDK for Java Functions, you can use DataApi.getAccessToken() to obtain the API access token for the invoking org. Use your API access token with your preferred HTTP request framework to make REST API calls back to the invoking org.
Be Aware of API Limits
Most org access from Functions behaves like an API request made to an org through the Salesforce REST API, and is subject to similar limits. See Limits.
Use Functions Buildpacks and Runtime Environment
Deploying with project:deploy uses a specific set of buildpacks to build the container image for your Function.
Deployed functions currently run in the Heroku-20 environment.
Use Heroku Data in Functions
Access a Heroku data store from within your function by adding a Heroku user as a collaborator to your function's compute environment.
Add a collaborator using the sf env compute collaborator add command with the Heroku user's email. Make sure you're logged into Salesforce Functions, then run:
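A sketch of the collaborator command; the --heroku-user flag name is an assumption, so check sf env compute collaborator add --help for the exact syntax:

```
sf env compute collaborator add --heroku-user collaborator@example.com
```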
This command triggers an email to the Heroku user letting them know you've added them as a collaborator. The Heroku user can then attach your function's compute environment to a data store as an add-on using the Heroku CLI. Use sf env list to get a list of compute environments, then run heroku addons:attach to attach the environment:
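The final step may look like the following sketch, where the add-on and app names are placeholders for your own values:

```
sf env list
heroku addons:attach <data-store-addon-name> --app <compute-environment-name>
```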