Bulk Data Ingest API

Managing large volumes of data in Marketing Cloud Engagement involves significant challenges, especially when importing millions of rows daily.

Traditional methods, such as standard REST APIs, can cause performance issues and disrupt other critical processes in scenarios where you:

  • Capture high-volume data during a time-sensitive event, such as a concert registration or a breaking news broadcast.
  • Transfer audience segments from a Customer Data Platform to Marketing Cloud Engagement.

The Bulk Data Ingest API is specifically designed to help you manage high-volume data imports without compromising performance or affecting other services.

Follow this process for bulk data ingestion to keep data management in your Marketing Cloud Engagement account efficient and smooth:

Define the destination Data Extension, specify how to handle data, and set the job's validity period.

To set up your ingest job:

  1. Create a job definition:
    • Use the Create bulk ingest job API to create a job definition that acts as a blueprint for your data import.
    • Specify the target data extension (destinationCustomerKey) and how you want to handle updates (updateType).
    • Choose JSON as the content type for your data.
    • Set an expiration time for the job definition.
  2. Receive a job ID:
    • Once Marketing Cloud Engagement validates your job definition, it returns a unique job ID (bulkApiDefinitionId).
    • This ID is essential for all subsequent interactions with the API for this specific job.
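As a rough sketch of step 1, the request body for a job definition might be assembled like this. The `destinationCustomerKey` and `updateType` fields come from the steps above; the base URL placeholder, field names for content type and expiration, and the example values are assumptions for illustration, not verbatim from the API reference.

```python
import json

# Hypothetical host; substitute your account's REST base URL.
BASE_URL = "https://YOUR_SUBDOMAIN.rest.marketingcloudapis.com"

def build_job_definition(destination_customer_key, update_type, expires_in_hours):
    """Assemble a Create bulk ingest job request body.

    destinationCustomerKey and updateType are named in the docs above;
    the contentType and expiration field names here are assumptions.
    """
    return {
        "destinationCustomerKey": destination_customer_key,  # target data extension
        "updateType": update_type,                           # how to handle updates
        "contentType": "application/json",                   # JSON content type
        "expiresInHours": expires_in_hours,                  # job definition validity period
    }

payload = build_job_definition("Concert_Signups_DE", "Overwrite", 24)
print(json.dumps(payload, indent=2))
```

On a successful response, capture the returned `bulkApiDefinitionId` and reuse it in every later call for this job.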

Upload your data in manageable chunks to the staging area, ensuring it is structured correctly and meets all validation rules.

With the job definition and job ID in place, you can start staging your data:

  1. Upload data in batches:
    • Use the Stage data for bulk ingest job API to send your data to Marketing Cloud Engagement in smaller, manageable batches.
    • Each batch is uploaded as a separate file to a temporary staging area.
  2. Structure data as JSON objects:
    • Ensure keys in your JSON objects match column names in your data extension.
    • Include values for all required columns.
    • Rows missing primary key values are skipped during import.
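The staging rules above can be sketched as two small helpers: one that drops rows missing the primary key (the import skips them anyway, so filtering early saves upload volume) and one that splits the remainder into batches for separate staging uploads. The column names, primary key, and batch size are example values, not requirements from the API.

```python
def valid_rows(rows, primary_key):
    """Drop rows missing the primary key; the import skips such rows."""
    return [r for r in rows if r.get(primary_key) not in (None, "")]

def chunk_rows(rows, batch_size):
    """Split rows into batches, one per staging upload."""
    for i in range(0, len(rows), batch_size):
        yield rows[i:i + batch_size]

# Keys must match the data extension's column names exactly.
rows = [
    {"SubscriberKey": "001", "Email": "a@example.com"},
    {"SubscriberKey": "", "Email": "no-key@example.com"},  # skipped: no primary key
    {"SubscriberKey": "002", "Email": "b@example.com"},
]
batches = list(chunk_rows(valid_rows(rows, "SubscriberKey"), batch_size=2))
print(len(batches))  # number of staging uploads needed
```

Each batch would then be sent as one file via the Stage data for bulk ingest job API, using the job ID from the previous step.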

Finalize the data staging process and begin the import into your data extension.

Once all your data is staged, follow these steps:

  1. Indicate job completion:
    • Use the Complete data staging API to confirm that you've finished staging your data.
    • This triggers validation and begins the import process.
  2. Monitor import progress:
    • Use the API to track the import job's progress and status.
    • Address any errors or issues that may arise during the import.
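The complete-then-monitor flow above amounts to a polling loop: confirm staging is done, then check the job's status until it reaches a terminal state. This is a minimal sketch; the status names and the `get_status` callable are stand-ins for real HTTP calls to the Complete data staging and job-status endpoints, not the API's actual response values.

```python
import time

def wait_for_import(get_status, job_id, poll_seconds=0, max_polls=10):
    """Poll a status function until the import reaches a terminal state.

    get_status stands in for an HTTP call that returns the job's status
    string; the terminal state names here are assumptions.
    """
    terminal = {"Completed", "Error"}
    for _ in range(max_polls):
        status = get_status(job_id)
        if status in terminal:
            return status
        time.sleep(poll_seconds)
    return "TimedOut"

# Simulated status sequence standing in for real API responses.
responses = iter(["Staging", "Importing", "Completed"])
result = wait_for_import(lambda job_id: next(responses), "bulkApiDefinitionId-123")
print(result)  # Completed
```

If the loop ends in an error state, retrieve the job summary to see which rows failed before retrying.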

Monitor the job's progress, retrieve job details, and manage job status as needed.

The Bulk Data Ingest API provides tools for managing your jobs:

  • Retrieve job details: Get information about a specific job's status and configuration.
  • Delete jobs: Remove jobs that haven't started processing.
  • Get job summaries: Review the outcome of completed jobs, including processed rows and errors.
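To illustrate the management calls above, a client might gate deletion on the job's state (only jobs that haven't started processing can be deleted) and condense a job summary into a report line. The state names and summary field names (`rowsProcessed`, `rowsInError`) are hypothetical; check the API reference for the real response shapes.

```python
def can_delete(job_status):
    """Only jobs that haven't started processing can be deleted.

    The pre-processing state names here are assumptions for illustration.
    """
    return job_status in {"Created", "Staging"}

def summarize(summary):
    """Condense a job-summary response into a one-line report."""
    processed = summary.get("rowsProcessed", 0)
    errors = summary.get("rowsInError", 0)
    return f"{processed} rows processed, {errors} errors"

print(can_delete("Staging"))    # deletable: not yet processing
print(can_delete("Importing"))  # not deletable: already processing
report = summarize({"rowsProcessed": 10000, "rowsInError": 3})
print(report)
```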

To view reference content for the Bulk Data Ingest APIs, see Bulk Data Ingest API.