Data Cloud-triggered flows let you automate actions based on near real-time data changes in Salesforce Data Cloud. This means you can respond to changes in your data as they happen, keeping your workflows and processes in sync with the latest information. You can automate processes like updating records, sending notifications, creating tasks, or even integrating with other systems based on specific data triggers. Data Cloud-triggered flows also offer seamless integration with Salesforce tools, a user-friendly interface, scalability for large data volumes, and an event-driven architecture for efficient, responsive workflows.

Invocable actions in Salesforce Flow are reusable actions that can be invoked from within a flow. These can be pre-built, or developers can create custom invocable actions using Apex, allowing for more complex or specific functionality that can then be used in flows. The advantages of using invocable actions include simplified flow design, since they reduce the need for complex logic within the flow itself. They also facilitate integration with external systems and services by providing actions that can make API calls or perform other integration tasks.

The brilliance of creating an invocable action is that it can be reused anywhere within Flow. So, if your Data Cloud-triggered flows need to perform the same action when triggered by different data model objects (DMOs), you can reuse the custom invocable action you created.

In this blog post, we’ll look at how to use Data Cloud-triggered flows with invocable actions to boost efficiency with Data Cloud automation.

Steps to build an automated workflow in Data Cloud

Imagine a scenario where a customer applies for a credit card online. The application data enters a data model object (DMO) in Data Cloud, triggering a Data Cloud-triggered flow. This flow invokes an action to check the customer’s credit score and writes that information back to Sales Cloud.

Image showing the overall flow of data into and out of Data Cloud

Let’s take a look at the steps involved.

Step 1: Define data needed

Before you ingest data into Data Cloud using the Ingestion API, it’s important to decide which fields are needed to support your use case. In this instance, your object needs to support all the fields tied to a credit card.

Image showing the fields that are part of the credit card object

Once you’ve decided on the fields needed to support your use case, follow the Ingestion API Schema File requirements to build a schema file that brings data into Data Cloud. Here is the sample YAML to support the object defined above:
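The complete schema for this use case lives in the GitHub repo linked below; as a sketch, an Ingestion API schema file is an OpenAPI 3.0.3 YAML document containing only a `components/schemas` section, and for a credit card application object it could look something like this (the field names here are illustrative assumptions, not the exact fields from the sample app):

```yaml
openapi: 3.0.3
components:
  schemas:
    credit_card_application:
      type: object
      properties:
        application_id:
          type: string
        first_name:
          type: string
        last_name:
          type: string
        email:
          type: string
        ssn:
          type: string
        credit_check_requested:
          type: boolean
        created_date:
          type: string
          format: date-time
```

Each property becomes a field on the data lake object that the Ingestion API creates for this source.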

The code that drives the sample application for our use case here is available on GitHub. The object design and YAML are tied to the POST method in the sample application.

Step 2: Ingest and map data

Next, set up a data stream that ingests the credit card data into Data Cloud. Map the incoming data lake object (DLO) to a data model object (DMO).

Image showing the credit_card_application DLO mapped to a custom credit_card_application DMO

Step 3: Create an invocable action

Next, create an Apex class for the data action: navigate to Setup > Developer Console and click File > New > Apex Class. Enter a name for the Apex class; in this case, we’re using “CreditCheck.” In the custom code below, we make an HTTP callout to a credit agency endpoint, which returns the score.
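The actual class is available in the GitHub repo; a minimal sketch of what an invocable credit check action could look like is below. The named credential `Credit_Agency` and the shape of the agency’s JSON response (a top-level `score` field) are assumptions for illustration:

```apex
public with sharing class CreditCheck {
    // Input passed in from the flow
    public class Request {
        @InvocableVariable(required=true)
        public String ssn;
    }

    // Output returned to the flow
    public class Result {
        @InvocableVariable
        public Integer creditScore;
    }

    @InvocableMethod(label='Check Credit Score' description='Calls a credit agency and returns the score')
    public static List<Result> checkCredit(List<Request> requests) {
        List<Result> results = new List<Result>();
        for (Request req : requests) {
            // Hypothetical named credential pointing at the credit agency's API
            HttpRequest httpReq = new HttpRequest();
            httpReq.setEndpoint('callout:Credit_Agency/score');
            httpReq.setMethod('POST');
            httpReq.setHeader('Content-Type', 'application/json');
            httpReq.setBody(JSON.serialize(new Map<String, String>{ 'ssn' => req.ssn }));

            HttpResponse res = new Http().send(httpReq);

            // Assumes the agency responds with JSON like {"score": 742}
            Map<String, Object> body = (Map<String, Object>) JSON.deserializeUntyped(res.getBody());
            Result r = new Result();
            r.creditScore = Integer.valueOf(body.get('score'));
            results.add(r);
        }
        return results;
    }
}
```

Note that the `@InvocableMethod` accepts and returns lists, so the action handles bulk invocations from the flow engine.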


Step 4: Create a platform event

Next, create a platform event to support the request for a credit check. A platform event is raised every time there is a request for a credit check. This platform event has two fields: one captures the SSN, and the other a Lead ID, which is then used to update an existing Lead record. Both of these fields will be referenced in the platform event-triggered flow.
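For reference, publishing this event from Apex (for example, in a quick test from Anonymous Apex) could look like the sketch below. The event and field API names (`Credit_Card_Application__e`, `SSN__c`, `Lead_ID__c`) are assumptions based on the labels above; in this use case, the event is actually published by the Data Cloud-triggered flow built in Step 6:

```apex
// Hypothetical API names for the event and its two fields
Credit_Card_Application__e evt = new Credit_Card_Application__e(
    SSN__c     = '123-45-6789',
    Lead_ID__c = '00Q000000000001'
);

// EventBus.publish enqueues the event; subscribers, like the
// platform event-triggered flow, receive it asynchronously
Database.SaveResult result = EventBus.publish(evt);
System.debug('Published: ' + result.isSuccess());
```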

Image showing how the platform event should be set up

Step 5: Create a platform event-triggered flow

Next, create a platform event-triggered flow that runs every time a platform event is created to request a credit check. This flow will use the Apex class to retrieve the credit score, and then update the Lead record with the credit score that is returned from the Apex class.

Configure the flow to be triggered when a Credit Card Application platform event is received.

Image showing how the flow is triggered

Create a variable called vCreditScore to store the credit score returned by the Apex class.

Image showing how the variable is created

Add the action you created using the Apex class in Step 3.

Image showing the Apex action to select

In the action, select the SSN being passed in as the parameter.

Image showing the field to select to pass to the Apex action

Set the output from the Apex action.

Image showing where the output of the Apex call should be stored

Update the Lead record using the Lead ID that triggered the flow, and update the credit score with the value returned from the invocable action.

Image showing the lead record being updated with the returned credit score

Finally, this is what the completed flow should look like:

Image showing the platform event-triggered flow

Step 6: Create a Data Cloud-triggered flow

Next, create a Data Cloud-triggered flow that runs every time a row is added to the Credit Card Application DMO that was mapped earlier. This flow will check whether consent is available. If consent is specified, then a Lead record will be created, and a call will be made to retrieve the credit score, with the Lead ID passed as a parameter. If consent is not available, then the flow will exit.

Configure the flow to be triggered when a record is added to the credit_card_application DMO.

Image showing the flow trigger criteria

Create a variable called vLeadId to store the ID from the newly created Lead record.

Image showing the variable being created

Create a decision split to see if a credit check has been requested. This decision is driven off a field in the credit_card_application DMO.

Image showing the decision tree that checks to see if a credit check is required

If the decision is true, then use an action to create a Lead record and save the Lead ID of the newly created record to the variable vLeadId.

Image showing the action to create a lead record

Create a Credit Card Application event, passing to it the SSN tied to the record in the DMO and the Lead ID of the record that was just created.

Image showing the action to create a platform event

This is what the completed flow should look like:

Image showing the completed Data Cloud-triggered flow

Activate both flows. As data comes into Data Cloud, the flow is triggered, invoking credit check requests that are performed by an invocable action making HTTP calls to a credit provider. In this example, we demonstrate using the invocable action in a single flow, but it can be reused and triggered by multiple Data Cloud-triggered flows.

Conclusion

Data Cloud-triggered flows and invocable actions offer powerful tools to automate and streamline processes within Salesforce Data Cloud. With near real-time data integration and the ability to invoke custom code using invocable actions, you can leverage a system built to handle large volumes of data and high-frequency updates. Because it uses an event-driven architecture, it reacts to data changes rather than running on a schedule, making it more efficient and responsive.

Resources

About the authors

Gina Nichols is a Director on the Salesforce Data Cloud product team. She has a background as an architect working with some of Salesforce’s enterprise customers in the MarTech space. She is also a co-author of the award-winning Transform the Consumer Experience Customer 360 Guide, which won an Award of Merit at the STC Chicago competition. You can follow her on LinkedIn.

Marcelo Galvão is a Distinguished Enterprise Architect based in São Paulo. He has a background as an architect working with some of Salesforce’s enterprise customers on complex projects. He is passionate about transforming customers’ needs using Salesforce technology. You can follow him on LinkedIn.
