Create a ZoomInfo Data Stream

After you set up a ZoomInfo connection, you can create a data stream to start the flow of data from your ZoomInfo source to Data Cloud.

  1. In Data Cloud, on the Data Streams tab, click New.

  2. Under Other Sources, select the ZoomInfo connection source, and click Next.

  3. Select from the available ZoomInfo connections.

  4. Select the object that you want to import, and click Next. You can select a new object on the Available Objects tab or an existing object on the In Use Objects tab.

  5. Select the data lake object that you want to ingest data to, configure the object details, and manage your fields.

  6. Under Object Details, for Category, identify the type of data in the data stream. For Primary Key, select a unique field that identifies a source record.

  7. If a composite key is required or a key doesn’t exist, you can create one using a formula field.
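For example, a composite key can be built by concatenating two source fields in a formula field. The `sourceField` syntax and `CONCAT` function shown here follow Data Cloud's formula expression library; the field names are hypothetical:

```
CONCAT(sourceField['CompanyId'], '-', sourceField['ContactId'])
```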

  8. (Optional) Select a record modified field.

  9. If data is received out of order, the record modified field provides a reference point to determine whether to update the record. The record with the most up-to-date timestamp is loaded.
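The out-of-order behavior described above can be sketched as follows. This is an illustrative model of the upsert rule, not Data Cloud's actual implementation, and the field names are hypothetical:

```python
from datetime import datetime

def upsert(store, record, key_field="Id", modified_field="LastModified"):
    """Keep only the record with the most recent modified timestamp per key."""
    key = record[key_field]
    existing = store.get(key)
    # If data arrives out of order, the record modified field decides
    # whether the incoming record replaces the stored one.
    if existing is None or record[modified_field] > existing[modified_field]:
        store[key] = record
    return store

store = {}
upsert(store, {"Id": "001", "LastModified": datetime(2024, 5, 2), "Name": "New Co"})
# An older update arriving late does not overwrite the newer record.
upsert(store, {"Id": "001", "LastModified": datetime(2024, 5, 1), "Name": "Old Co"})
```

Without a record modified field, the late-arriving older record would overwrite the newer one.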

  10. (Optional) To identify the data lineage of a record’s business unit, add the organization unit identifier.

  11. Click Next.

  12. For Data Space, if the default data space isn’t selected, assign the data stream to the appropriate data space.

  13. Click Deploy.

    The Refresh Mode is Full Refresh by default. With a full refresh, the previous data is deleted on the next refresh and replaced with the newly imported data.

  14. Set up the search filter for the connection.

  15. Schedule the refresh frequency and time. You can schedule the data refresh on a daily, weekly, or monthly basis. You can also refresh the data manually at any time.

  16. Click Deploy.

After a successful Last Run, the data stream page shows the number of records processed and the total records loaded. You can then map your data lake object to the semantic data model and use the data in segments, calculated insights, and other use cases.