Optimize API Calls and Data Structures to Improve Performance

There are several factors that can impact the performance of your API calls. You can improve API performance by implementing these best practices.

To import information for individual subscribers, use the SOAP API to create or update a record on a subscriber list or in a data extension. In the REST API, send the information in a POST request to the contacts resource. Even if you pass an array of subscribers in a single call, the system processes each subscriber individually.
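For example, a single-contact update through the REST contacts resource might look like this sketch in Python. The tenant subdomain, access token, contact key, and attribute set names are placeholders; verify the payload shape against the Contacts REST reference and your account's attribute model.

```python
import requests

# Placeholder values -- substitute your tenant subdomain and a valid OAuth token.
SUBDOMAIN = "your-tenant-subdomain"
ACCESS_TOKEN = "your-access-token"

# Create or update a single contact through the contacts resource.
payload = {
    "contactKey": "subscriber-001",  # hypothetical contact key
    "attributeSets": [
        {
            "name": "Email Addresses",
            "items": [
                {
                    "values": [
                        {"name": "Email Address", "value": "jane.doe@example.com"},
                        {"name": "HTML Enabled", "value": "true"},
                    ]
                }
            ],
        }
    ],
}

response = requests.post(
    f"https://{SUBDOMAIN}.rest.marketingcloudapis.com/contacts/v1/contacts",
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
    json=payload,
)
response.raise_for_status()
```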

You can improve performance by using a relational data model. Store less frequently accessed data in data extensions that are separate from the data that you use most often. This structure allows you to access and process data only as needed, instead of forcing the system to load unused subscriber attributes every time you reference a list in a call.
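As an illustration of this kind of split, suppose frequently used fields live in a hypothetical Subscriber_Core data extension and rarely used preference fields live in a separate Subscriber_Preferences data extension. A query activity joins the two only when a process actually needs the extra attributes:

```python
# Hypothetical relational split: Subscriber_Core holds the fields used in most
# sends, while Subscriber_Preferences holds rarely used attributes. The join
# runs only in the query activity that needs it, so routine processes never
# load the extra columns.
ON_DEMAND_JOIN_SQL = """
SELECT
    c.SubscriberKey,
    c.EmailAddress,
    p.NewsletterTopics,
    p.PreferredLanguage
FROM Subscriber_Core AS c
INNER JOIN Subscriber_Preferences AS p
    ON p.SubscriberKey = c.SubscriberKey
"""
```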

To import a larger number of subscribers, use the API to start an asynchronous activity.

Marketing Cloud Engagement optimizes all asynchronous activities to process data in large batches. Design your import definitions so that you can reuse them and avoid creating a definition for every data import. Creating a definition for each operation can result in performance issues.
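For example, a reusable import definition can be started on demand with the SOAP Perform call. This is a sketch only: the subdomain, token, and the Weekly_Subscriber_Import customer key are placeholders, so check the envelope against the SOAP API reference for your account.

```python
import requests

SUBDOMAIN = "your-tenant-subdomain"   # placeholder
ACCESS_TOKEN = "your-access-token"    # placeholder OAuth token

# Start an existing, reusable import definition by its external (customer) key.
soap_body = f"""<?xml version="1.0" encoding="UTF-8"?>
<s:Envelope xmlns:s="http://schemas.xmlsoap.org/soap/envelope/"
            xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
  <s:Header>
    <fueloauth xmlns="http://exacttarget.com">{ACCESS_TOKEN}</fueloauth>
  </s:Header>
  <s:Body>
    <PerformRequestMsg xmlns="http://exacttarget.com/wsdl/partnerAPI">
      <Action>start</Action>
      <Definitions>
        <Definition xsi:type="ImportDefinition">
          <CustomerKey>Weekly_Subscriber_Import</CustomerKey>
        </Definition>
      </Definitions>
    </PerformRequestMsg>
  </s:Body>
</s:Envelope>"""

response = requests.post(
    f"https://{SUBDOMAIN}.soap.marketingcloudapis.com/Service.asmx",
    headers={"Content-Type": "text/xml", "SOAPAction": "Perform"},
    data=soap_body.encode("utf-8"),
)
response.raise_for_status()
```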

Plan your imports to run consecutively instead of concurrently. Performing multiple imports to the same data extension or subscriber list at the same time adversely affects performance. In Enterprise 2.0 accounts, all business units in the same enterprise share subscribers. To improve performance, stagger your imports programmatically so that they don’t reduce your overall enterprise performance.
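One way to stagger imports programmatically is to wait for each import to complete before starting the next. This sketch assumes two hypothetical callables, start_import and import_is_running, that wrap whatever start and status-check calls your integration already uses:

```python
import time

def run_imports_consecutively(import_definition_keys, start_import, import_is_running):
    """Run each import only after the previous one finishes.

    start_import(key) kicks off one import definition (for example, with the
    SOAP Perform call shown earlier). import_is_running(key) reports whether
    that import is still in progress. Both are placeholders.
    """
    for key in import_definition_keys:
        start_import(key)              # start one import at a time
        while import_is_running(key):  # poll until it completes
            time.sleep(60)             # wait before checking again
```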

An ideal use case is to import subscribers to the All Subscribers list in your account, including profile data. Next, perform consecutive imports to individual lists as necessary. These additional imports don’t need to include profile attribute data, because the initial import to the All Subscribers list included it.

Marketing Cloud Engagement activities handle large amounts of data in an asynchronous operation. Use import activities, query activities, and filter activities to process this data without the need for multiple calls. Avoid using these activities for single records or rows in a data extension.

Plan your activities to reduce the server and cache space required to run them. Working with smaller amounts of data lets the servers cache data more effectively and use fewer resources to read it. For your filter and query activities, review the fields in your SQL queries and include only the fields you need. For example, if your solution processes data through multiple filter and query activities in succession, carry only the fields that each stage requires, and then use a final query activity to join the results back to the original dataset. This approach reduces the resources required to store and retrieve data at each stage of the process.
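For example, intermediate query activities can carry only the subscriber key plus the fields each stage needs, and a final query joins the trimmed results back to the full record. The data extension and column names here are hypothetical:

```python
# Stage 1: filter on the minimum set of fields.
STAGE_1_SQL = """
SELECT SubscriberKey, LastPurchaseDate
FROM Purchase_History
WHERE LastPurchaseDate >= DATEADD(DAY, -30, GETDATE())
"""

# Stage 2: narrow the audience further, still carrying only the key.
STAGE_2_SQL = """
SELECT p.SubscriberKey
FROM Stage_1_Results AS p
INNER JOIN Email_Engagement AS e
    ON e.SubscriberKey = p.SubscriberKey
WHERE e.OpenedLast90Days = 1
"""

# Final stage: join the trimmed result set back to the full subscriber record
# only once, when the send-ready data extension is built.
FINAL_SQL = """
SELECT s.SubscriberKey, s.EmailAddress, s.FirstName, s.Region
FROM Subscriber_Core AS s
INNER JOIN Stage_2_Results AS r
    ON r.SubscriberKey = s.SubscriberKey
"""
```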

For processes that involve importing a single row into a data extension, use the SOAP API to create a DataExtensionObject. This type of call adds a row to an existing data extension instead of creating an entirely new data extension.
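A single-row insert of this kind might look like the following sketch, which posts a SOAP Create request for a DataExtensionObject. The data extension customer key and column names are placeholders; confirm the envelope details against the SOAP API reference.

```python
import requests

SUBDOMAIN = "your-tenant-subdomain"   # placeholder
ACCESS_TOKEN = "your-access-token"    # placeholder OAuth token

# Add a single row to an existing data extension, identified here by a
# hypothetical customer key, rather than creating a new data extension.
soap_body = f"""<?xml version="1.0" encoding="UTF-8"?>
<s:Envelope xmlns:s="http://schemas.xmlsoap.org/soap/envelope/"
            xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
  <s:Header>
    <fueloauth xmlns="http://exacttarget.com">{ACCESS_TOKEN}</fueloauth>
  </s:Header>
  <s:Body>
    <CreateRequest xmlns="http://exacttarget.com/wsdl/partnerAPI">
      <Objects xsi:type="DataExtensionObject">
        <CustomerKey>Purchases_DE</CustomerKey>
        <Properties>
          <Property><Name>SubscriberKey</Name><Value>subscriber-001</Value></Property>
          <Property><Name>PurchaseDate</Name><Value>2024-05-01</Value></Property>
        </Properties>
      </Objects>
    </CreateRequest>
  </s:Body>
</s:Envelope>"""

response = requests.post(
    f"https://{SUBDOMAIN}.soap.marketingcloudapis.com/Service.asmx",
    headers={"Content-Type": "text/xml", "SOAPAction": "Create"},
    data=soap_body.encode("utf-8"),
)
response.raise_for_status()
```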

When you import multiple rows, plan your import activities so that each asynchronous import updates as many rows as possible.
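For example, instead of one call per row, a batch of rows can be sent in a single request to the asynchronous data extension rows resource. The data extension key and column names are placeholders, and the exact path and HTTP verb for your use case should be confirmed in the REST reference.

```python
import requests

SUBDOMAIN = "your-tenant-subdomain"   # placeholder
ACCESS_TOKEN = "your-access-token"    # placeholder
DE_KEY = "Purchases_DE"               # hypothetical data extension key

# Send many rows in one asynchronous request instead of one call per row.
rows = {
    "items": [
        {"SubscriberKey": "subscriber-001", "PurchaseDate": "2024-05-01"},
        {"SubscriberKey": "subscriber-002", "PurchaseDate": "2024-05-02"},
    ]
}

response = requests.post(
    f"https://{SUBDOMAIN}.rest.marketingcloudapis.com"
    f"/data/v1/async/dataextensions/key:{DE_KEY}/rows",
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
    json=rows,
)
response.raise_for_status()
```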

Avoid processes that create many import definitions and data extensions that each hold only one record or a few records. These processes adversely affect performance. Use fewer data extensions with more rows, and reuse import and send definitions whenever possible.

Deleting data extensions removes them from your account view, but these entities persist until a regular deprecation process removes them. By reusing a smaller number of data extensions, you limit the number of entities the system must account for and enable faster performance.

When you delete columns from data extensions, those columns persist as well. If you perform multiple delete operations in a single data extension, recreating the data extension can improve performance because it eliminates unnecessary columns and indexes. You can also implement these activities to permanently remove data from a data extension:

  • Overwrite Import Activity
  • Query Activity
  • The Clear Data feature

In triggered send calls, including a large amount of data in the subscriber request body slows processing. Limit the data passed to the triggered send call at send time and build as much information as possible into the message itself, for example by using AMPscript and personalization strings to retrieve information at send time. You can also use parameters in links to reduce the number of links and provide more usable tracking data.
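For example, a triggered send request can pass only the address and subscriber key, while the message retrieves profile data at send time with AMPscript. The external key, data extension name, and attribute names in this sketch are placeholders.

```python
import requests

SUBDOMAIN = "your-tenant-subdomain"        # placeholder
ACCESS_TOKEN = "your-access-token"         # placeholder
TRIGGERED_SEND_KEY = "order-confirmation"  # hypothetical external key

# Pass only the minimum data needed to address the message. Content such as
# first name is resolved in the email at send time, for example with AMPscript:
#   %%=Lookup("Customer_Profile", "FirstName", "SubscriberKey", _subscriberkey)=%%
payload = {
    "To": {
        "Address": "jane.doe@example.com",
        "SubscriberKey": "subscriber-001",
    }
}

response = requests.post(
    f"https://{SUBDOMAIN}.rest.marketingcloudapis.com"
    f"/messaging/v1/messageDefinitionSends/key:{TRIGGERED_SEND_KEY}/send",
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
    json=payload,
)
response.raise_for_status()
```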

For multiple triggered send calls, perform each triggered send in a single request to reduce the demand on the system. Ensure that you separate triggered send calls from other API calls to improve performance.

Use only the number of business units that you need. In an Enterprise 2.0 account, run imports and activities against subscriber lists in only one business unit at a time. Marketing Cloud Engagement shares subscriber data across all business units, regardless of which business unit the operation runs in, so these operations can affect performance across all business units.

You can run imports and activities on multiple business units within a Lock-and-Publish or On-Your-Behalf account without a reduction in performance.

The amount of data stored and the processes performed on that data are different in each account. This table outlines the thresholds that we recommend you follow to optimize the performance of your account.

  • The Normal column indicates the optimum usage level for the listed entity.

  • The Aggressive column indicates situations that can result in performance issues.

  • The Extreme column indicates situations that can lead to significant performance degradation that requires immediate remediation.

    Entity | Normal | Aggressive | Extreme
    Concurrent Imports | 1 | 2 | 3 or more
    Data Extensions in an Account | 0–1000 | 1000–10000 | >10000
    Enterprise 2.0 Business Units | 0–1000 | 1000–2000 | >2000
    Import and Filter Definitions | 0–1000 | 1000–2000 | >2000
    Lists and Groups in an Account | 0–1000 | 1000–10000 | >10000
    Objects in a Folder | 0–1000 | 1000–2000 | >2000
    Profile Attributes | 0–50 | 50–100 | >100
    Profile Attributes in an Enterprise 2.0 Account | 0–50 | 50–100 | >100
    Profile Attributes in an On-Your-Behalf or Lock-and-Publish Account | 0–50 | 50–100 | >100
    Profile Attributes Included in an Account or Business Unit Filter Definition | 1–2 | 3–5 | 6 or more
    Rows in a Data Extension | 0–1 million | 1 million to 100 million | >100 million
    Sender Profiles | 0–50 | 50–100 | >100
    Subscribers | 0–2 million | 2 million to 10 million | >10 million
    Users | 0–1000 | 1000–2000 | >2000