We launched Salesforce Functions last fall, and the response so far has been terrific. While the most obvious use cases for functions involve stateless data processing, many business processes can take advantage of the simplified operating model of functions but require persistent state that spans function invocations.

Today, we’re happy to announce a new feature that enables stateful function invocations using Heroku Data products. It lets your functions securely access Heroku Postgres, Apache Kafka on Heroku, and Heroku Redis directly from your code.

Access to Heroku Data is enabled through collaboration between your Salesforce org and a Heroku account. Collaboration is easy to set up: Functions developers grant access to data stores running on Heroku by adding a Heroku account as a collaborator:

sf env compute collaborator add --heroku-user username@example.com

The Heroku account can then share the data store with a Functions compute environment. Simply get the name of the compute environment you want to give access to, then attach the data store to the environment.

Get the name of the compute environment from the sf cli:

sf env list

Then attach it:

heroku addons:attach <example-postgres-database> --app <example-compute-environment-name>

This currently works only for data stores running in the Common Runtime, for example, Standard and Premium Postgres plans. We hope to expand this to allow existing private data stores to be securely exposed to Functions as well. If you are new to functions, see Get Started with Salesforce Functions for an overview and quick start.

Connecting Heroku Data and Functions opens up many new use cases:

  • Create a function to easily iterate over data in Heroku Postgres, including data managed by Heroku Connect.
  • Produce messages into an Apache Kafka on Heroku stream, making it easier to deploy Apache Kafka on Heroku as an orchestration layer for microservices on the Heroku platform.
  • Share a job queue or cache based on Heroku Redis.
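As a sketch of the shared-job-queue idea, the helpers below push and pop JSON jobs on a Redis list. The helper names and the in-memory stand-in client are hypothetical, for local illustration only; in a function you would inject a real Redis client connected via the REDIS_URL config var that Heroku Redis provides.

```javascript
// Sketch of a shared job queue on a Redis list (LPUSH to enqueue, RPOP to
// dequeue). The client is injected so any Redis library exposing
// lpush/rpop can be used.

async function enqueue(client, queue, job) {
  // Serialize so producer and consumer functions agree on the format.
  return client.lpush(queue, JSON.stringify(job));
}

async function dequeue(client, queue) {
  const raw = await client.rpop(queue);
  return raw === null ? null : JSON.parse(raw);
}

// In-memory stand-in for a Redis client, just to exercise the helpers
// locally. It mimics LPUSH (returns new list length) and RPOP (returns
// null when the list is empty).
function fakeRedis() {
  const lists = new Map();
  return {
    async lpush(key, value) {
      if (!lists.has(key)) lists.set(key, []);
      lists.get(key).unshift(value);
      return lists.get(key).length;
    },
    async rpop(key) {
      const list = lists.get(key);
      return list && list.length ? list.pop() : null;
    },
  };
}
```

Because dequeuing is first-in, first-out, one function can enqueue work while another function, or a worker dyno, drains the same queue.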

We can’t wait to hear your feedback.


Sign up for a free trial to explore Salesforce Functions.

Developer advocate Julián Duque will be diving into this subject in greater detail at TrailblazerDX next month. In this session, Julián will be demonstrating how to access PostgreSQL, Redis, and Apache Kafka from a Function. To join us at TDX ’22, register here and take a look at all of the sessions we have planned.

This blog was originally published on the Heroku blog. Check out the latest Heroku news here. 

Get the latest Salesforce Developer blog posts and podcast episodes via Slack or RSS.
