Integration is an essential topic for Salesforce developers and architects. One type of integration has gained momentum in recent years: process integration, perhaps better known as event-driven integration. The Salesforce Core Platform supports event-driven integration (or event-driven architecture) through the publication and subscription of platform events to other systems. It can also be a great way to create loosely coupled processes on-platform, which is described in great detail in Frank Caron and Pete O’Connell’s article on Event-Driven App Architecture on the Customer 360 Platform.

Platform Events on Salesforce were introduced in 2017, allowing customers to define custom event structures and to publish and subscribe to those events. They have since evolved into what’s now known as High Volume Platform Events (or HVPEs), which greatly expand the publish/subscribe allocations over the original offering (see Platform Event Allocations). The next generation of the Platform Events subscription mechanism, the Pub/Sub API, has been re-architected and optimized for performance and scale, and even supports a new subscriber protocol, gRPC.

There are various ways to publish and consume events from the Salesforce Event Bus, both on the platform and externally. This article will compare three specific ways to publish events: Change Data Capture (or CDC), Record-Triggered Flows, and of course, Apex. Keep in mind that the following comparisons are focused on publishing Platform Events only. This is not intended to be a guide to how you approach development on the Salesforce platform in general. Refer to the Official Salesforce Architects guide to Record-Triggered Automation to ensure a well-architected solution.

Change Data Capture

Change Data Capture, henceforth referred to as CDC, is the fastest and easiest way to get started publishing events. CDC sends notifications for created, updated, deleted, and undeleted records as change events, a platform event counterpart of the supported custom or standard object. CDC is configurable through the Setup menu by selecting the object and adding it to the CDC Selected Entities section.

When an object is selected for CDC, an accompanying platform event object is created, easily identifiable by its name. For standard objects like Account, you can expect to see AccountChangeEvent, and for a custom object like Shipment__c, you will see Shipment__ChangeEvent. Configuring objects for CDC is simple and straightforward, and extending the default allocations with add-on licenses is easy and affordable (see Change Data Capture Allocations).
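Change events can also be consumed on-platform with an Apex trigger on the generated change event object. Here is a minimal sketch of such a subscriber, using the Shipment__c object that appears later in this article (the trigger name is an assumption):

```apex
// Hypothetical on-platform subscriber: an after-insert Apex trigger on the
// change event object that CDC generates for Shipment__c.
trigger ShipmentChangeEventTrigger on Shipment__ChangeEvent (after insert) {
    for (Shipment__ChangeEvent event : Trigger.new) {
        // Every change event carries a header describing the change.
        EventBus.ChangeEventHeader header = event.ChangeEventHeader;
        // changeType is CREATE, UPDATE, DELETE, or UNDELETE.
        System.debug(header.changeType + ' on ' + header.entityName +
            ' for record(s): ' + header.recordIds);
    }
}
```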

Remember that CDC publishes platform events for every record created, updated, deleted, and undeleted. This is an incredibly powerful way to enable process integration. This works great when you can predict that the number of changes in the CDC-configured objects will be well within your daily publish and subscribe limits. However, this can cause concern if there are non-user-based updates to your objects in the form of batch Apex updates, ETL jobs, and data loads. This can potentially overload your allocations and lead to failed event deliveries.

“non-user-based updates to your objects in the form of batch Apex updates, ETL jobs, and data loads … can potentially overload your allocations and lead to failed event deliveries.”

You may now be asking yourself whether there’s a way to avoid flooding subscribing systems with messages that aren’t relevant. There are several ways, and they are the subject of the rest of this article.

Introducing Filtered Streams

What if there was a way to avoid overloading the subscribing systems when batch jobs and ETLs update a large number of records? Generally Available in Winter ’23, Filtered Streams allow you to create a channel and configure it with a complex filter that CometD clients can subscribe to. With fewer events delivered to subscribers, they can make more efficient use of the event delivery allocation (see Filter Your Stream of Platform Events with Custom Channels).

Filters do allow you to avoid overloading external systems with excessive and unnecessary event notifications for CDC, but what if you want to prevent the publication of the messages in the first place? There are two recommendations for putting the filter ahead of the publication: record-triggered flows with filters, and Apex triggers that filter event publishing.

Record-triggered flows with filters

An alternative method of publishing change events that offers more control over which records are published is to use a record-triggered flow. This method requires that you define a platform event, a step that CDC performs automatically. Defining a new platform event is similar to creating a new custom object: you provide a name and label, and define custom fields. The custom fields defined on the platform event consist of the important fields that you want to supply to the event consumer. Once you have defined a platform event, it is available in Flow Builder for publishing.

Creating a record-triggered flow to publish platform events is a low-code task; however, it is still a development activity and should run through the application lifecycle management process. Suffice it to say, the steps that follow should be done in a sandbox, tracked in version control, and promoted through standard processes. You should also create an error-handling strategy for your flow-based automation, just as you would for Apex (the 4th of the 7 Things Architects Should Know About Flow).

Create a new flow, select Record-Triggered Flow from the first screen, then supply the entry criteria that determine whether the flow will execute when records for the specified object are created or updated.

Record-triggered flow configuration Start screen.
In this example (1) Shipment__c is configured as the object to observe for changes. The event that will trigger this flow is when (2) a record is created or updated. The conditional requirements are set to (3) (OR), so when any of the following conditions are met (4), the flow execution continues. In short, when Shipment records are changed and the Status field is either “Shipped” or “Delivered,” the actions following the Start item will run immediately. In this case, we’re going to publish a platform event.

In flow terms, we will be creating a record (but we’re really publishing an event), so we’ll need a record variable. The following shows a variable named shipmentEventRecord to hold the publishing values.

Flow resource definition of record data type
Next, we need to assign the values from the change record that triggered the flow to the shipmentEventRecord. This is shown in the assignment element below.

Flow assignment component configuration.
The platform event record is ready to be published. Flows publish platform events using a Create Records element.

Flow Create Record component configuration.
That’s it! We defined the start criteria, used an Assignment to map the incoming Shipment__c values to the ShipmentEvent__e record, and finally used Create Records to publish the event. It is not as easy as configuring objects for CDC, though, and it can require maintenance when the data model for the source object changes.

Note: This example does not include error handling. As mentioned above, you should always consider an error-handling strategy for your flow-based automation.

Entire Record-Triggered Flow to publish platform event.

Apex triggers to filter event publishing

Flows offer a lot of functionality that used to only be available by writing Apex triggers, so when would it be appropriate to filter event publications using Apex? There will always be scenarios when it might be better to use Apex than Flows. Here are just a few that come to mind:

  • Your organization has adopted a single trigger approach as a design principle. Using a single trigger approach/architecture helps with code traceability and enforces the order of execution. Check with your Salesforce architects and development leads to see if this applies.
  • You have very specific string manipulations for publishing events that cannot easily be accomplished using Flows. Flows are very powerful but don’t have the complete functionality of the Apex String Class.
  • Your development team’s standard approach is to write Apex. Salesforce’s motto is “Clicks before Code,” but many customers have sophisticated development requirements where writing code is preferred over co-mingling Apex with Process Builders, Workflows, and Flows.

Here are some considerations when using Apex to publish Platform Events.

  • Writing Apex triggers means that you must also write Apex unit tests, and when trigger code changes, so must the unit test. Make sure to include time in your development estimates to write proper unit tests.
  • Well-written Apex checks the SaveResult of each publish call and routes any failures to your logging strategy.

The following snippet is the Apex trigger equivalent of the flow from above:
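A sketch of that trigger, reconstructed from the flow example above (the ShipmentNumber__c and Status__c field API names are assumptions):

```apex
// Hypothetical single trigger on Shipment__c that filters which records
// get published as ShipmentEvent__e platform events.
trigger ShipmentTrigger on Shipment__c (after insert, after update) {
    List<ShipmentEvent__e> events = new List<ShipmentEvent__e>();
    for (Shipment__c ship : Trigger.new) {
        // Mirror the flow's entry criteria: publish only when the Status
        // is "Shipped" or "Delivered".
        if (ship.Status__c == 'Shipped' || ship.Status__c == 'Delivered') {
            events.add(new ShipmentEvent__e(
                ShipmentNumber__c = ship.Name,
                Status__c = ship.Status__c
            ));
        }
    }
    if (!events.isEmpty()) {
        // EventBus.publish returns one SaveResult per event.
        List<Database.SaveResult> results = EventBus.publish(events);
        for (Database.SaveResult sr : results) {
            if (!sr.isSuccess()) {
                for (Database.Error err : sr.getErrors()) {
                    // Route failures to your logging strategy.
                    System.debug(LoggingLevel.ERROR,
                        'Event publish failed: ' + err.getMessage());
                }
            }
        }
    }
}
```

Note how the trigger bulkifies the publish call by collecting events into a list, and checks each SaveResult as discussed above.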

Conclusion

Platform Events are a powerful way to enable loosely coupled, near-real-time communication between systems. There are several options for publishing events, ranging from no-code with Change Data Capture, to low-code (sort of) with record-triggered flows, to pro-code using Apex triggers. Choose your event publishing solution carefully.

Summary Chart

Change Data Capture (CDC): No-Code
  • Benefits: Low-cost add-on allotments; ease of configuration; automatically creates the custom fields in the <Object>ChangeEvent structure
  • Considerations: Publishes all record changes, including those made by ETL jobs, batch Apex updates, and other data loads

CDC with Filters: No-Code
  • Benefits: Reduction of delivered messages
  • Considerations: Uses resources to publish records that will never get delivered

Record-Triggered Flows: Low-Code
  • Benefits: Admins and developers familiar with Flow can create and manage them
  • Considerations: Can require updates to the flow mappings as the data model changes

Apex Triggers: Pro-Code
  • Benefits: Highest level of control over filtering and order of execution; complies with the single trigger approach
  • Considerations: Can require updates to Apex code and unit test coverage as the data model changes

About the Author

Michael Norton

Michael Norton is a Distinguished Technical Architect at Salesforce and is a Certified Development Lifecycle and Deployment Architect, Integration Architect, App Builder, Platform Developer I & II, and Administrator. He has over 30 years of software development experience and 14 years of implementation and expertise on the Salesforce Platform. Michael joined Salesforce in 2018 and is passionate about helping customers build scalable and maintainable applications.
