The Ingestion API provides a RESTful interface that supports both streaming and bulk interaction patterns for loading data into Data Cloud. A single Ingestion API connector can perform both bulk and streaming uploads.
Consider the data source and the problem you’re trying to solve when choosing the ingestion mode.
Use bulk ingestion when moving large amounts of data on a daily, weekly, or monthly schedule, for example, an initial load of historical data or a recurring warehouse export.
Use streaming ingestion for small micro-batches of records updated in near-real time, such as ongoing changes captured from a mobile app.
Here's an example use case.
Imagine you're an integrator working for Northern Trail Outfitters (NTO). You need to extract runner profiles and activity logs from NTO's Track My Run mobile app and load them into Data Cloud. The marketer you're working with needs the last 90 days of historical data and wants all new and updated data on a go-forward basis.
You query the last 90 days from your data warehouse and chunk the data into 150 MB CSV files, as sketched below. You then load the files into Data Cloud via bulk ingestion, which is well suited to this one-time operation. With the initial dataset loaded, you synchronize future changes with upsert operations via streaming ingestion, forwarding updates to Data Cloud in micro-batches as soon as the data becomes available.
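Here is a minimal sketch of the chunking step, assuming a standard CSV export with a header row. The file names are illustrative; the header is repeated in each chunk so every file can be uploaded independently.

```python
import csv

# Split a large export into ~150 MB CSV chunks, repeating the header row
# in each file so every chunk is independently loadable. File names are
# illustrative.
CHUNK_BYTES = 150 * 1024 * 1024

def split_csv(path: str, chunk_bytes: int = CHUNK_BYTES) -> None:
    with open(path, newline="") as src:
        reader = csv.reader(src)
        header = next(reader)
        part, out, writer = 0, None, None
        for row in reader:
            # Start a new chunk when none is open or the current one is full.
            if out is None or out.tell() >= chunk_bytes:
                if out is not None:
                    out.close()
                part += 1
                out = open(f"{path}.part{part:03d}.csv", "w", newline="")
                writer = csv.writer(out)
                writer.writerow(header)
            writer.writerow(row)
        if out is not None:
            out.close()

split_csv("activity_logs_last_90_days.csv")
```

Because rows are written whole, sizing by bytes never splits a record across files.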
Get Started with Ingestion API
Before using the Ingestion API in Data Cloud, complete the prerequisites, set up authentication, and review the limits that apply to bulk ingestion and streaming ingestion.
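As a rough illustration of the authentication step, the sketch below exchanges an existing Salesforce OAuth access token for a Data Cloud token. The domain and token values are placeholders; verify the token-exchange endpoint and parameters against your org's setup.

```python
import requests

# Placeholders: your org's My Domain and an OAuth access token obtained
# through a supported flow (e.g., the JWT bearer flow).
MY_DOMAIN = "https://mydomain.my.salesforce.com"
CORE_TOKEN = "<salesforce-oauth-access-token>"

# Exchange the core access token for a Data Cloud token.
resp = requests.post(
    f"{MY_DOMAIN}/services/a360/token",
    data={
        "grant_type": "urn:salesforce:grant-type:external:cdp",
        "subject_token": CORE_TOKEN,
        "subject_token_type": "urn:ietf:params:oauth:token-type:access_token",
    },
)
resp.raise_for_status()
payload = resp.json()

DC_TOKEN = payload["access_token"]
# Normalize in case the instance URL is returned with or without a scheme.
DC_HOST = payload["instance_url"].removeprefix("https://")
```

The returned token and tenant-specific host are then used for all Ingestion API calls.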
The Data Cloud Ingestion API uses a fire-and-forget pattern to synchronize micro-batches of updates between the source system and Data Cloud in near-real time. Data is processed asynchronously approximately every 3 minutes.
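To illustrate the streaming pattern, here is a minimal sketch that posts one micro-batch of records. The source API name (Track_My_Run) and object name (runner_profiles) are hypothetical; use the names from your Ingestion API connector and verify the endpoint shape against your configuration.

```python
import requests

# Placeholders carried over from the token-exchange sketch above.
DC_HOST = "abc123.c360a.salesforce.com"
DC_TOKEN = "<data-cloud-access-token>"

# Hypothetical source API name and object name from the connector setup.
url = f"https://{DC_HOST}/api/v1/ingest/sources/Track_My_Run/runner_profiles"
batch = {
    "data": [
        {"runner_id": "r-001", "email": "jane@example.com", "city": "Oakland"},
        {"runner_id": "r-002", "email": "raj@example.com", "city": "Austin"},
    ]
}

resp = requests.post(url, json=batch, headers={"Authorization": f"Bearer {DC_TOKEN}"})
# Fire-and-forget: success means the micro-batch was accepted for
# asynchronous processing (roughly every 3 minutes), not that it landed.
resp.raise_for_status()
```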
With the Data Cloud Ingestion API, you can upsert or delete large data sets. Prepare a CSV file for the data you want to upload, create a job, upload job data, and let Salesforce take care of the rest.
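The bulk flow follows a create-upload-close job lifecycle. The sketch below illustrates it with the same hypothetical source and object names; verify paths and payloads against the Ingestion API reference for your org.

```python
import requests

# Placeholders carried over from the earlier sketches.
DC_HOST = "abc123.c360a.salesforce.com"
DC_TOKEN = "<data-cloud-access-token>"

BASE = f"https://{DC_HOST}/api/v1/ingest"
HEADERS = {"Authorization": f"Bearer {DC_TOKEN}"}

# 1. Create a bulk upsert job for the (hypothetical) object and source.
resp = requests.post(
    f"{BASE}/jobs",
    json={"object": "runner_profiles", "sourceName": "Track_My_Run", "operation": "upsert"},
    headers=HEADERS,
)
resp.raise_for_status()
job_id = resp.json()["id"]

# 2. Upload the prepared CSV as the job's data.
with open("runner_profiles.csv", "rb") as f:
    requests.put(
        f"{BASE}/jobs/{job_id}/batches",
        data=f,
        headers={**HEADERS, "Content-Type": "text/csv"},
    ).raise_for_status()

# 3. Close the job; Salesforce queues it for asynchronous processing.
requests.patch(
    f"{BASE}/jobs/{job_id}",
    json={"state": "UploadComplete"},
    headers=HEADERS,
).raise_for_status()
```

Once the job is closed, Salesforce processes the uploaded data without further action on your part.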
Data Cloud can ingest data in real time using your existing Ingestion API integrations. To enable this for an Ingestion API Data Lake Object (DLO), make sure that the Data Model Object (DMO) the DLO is mapped to is a member of a real-time data graph.