Create a PostgreSQL Data Stream

Create a data stream to start the flow of data from your PostgreSQL source to create a data lake object (DLO) in Data Cloud.

Before you begin:

  • Make sure the PostgreSQL connection is set up.
  • Review the IP addresses that must be allowlisted so that the PostgreSQL connection has the necessary access.
  1. In Data Cloud, on the Data Streams tab, click New.

    You can also use App Launcher to find and select Data Streams.

  2. Under Other Sources, select the PostgreSQL connection source, and click Next.

  3. Select from the available PostgreSQL connections.

  4. Select the object that you want to import, and click Next.

    You can select one new object on the Available Objects tab or an existing object on the In Use Objects tab.

  5. Under Object Details, for Category, select the category that identifies the type of data in the data stream: Profile, Engagement, or Other.

  6. For Primary Key, select a unique field to identify a record. To check which columns are uniquely constrained in your PostgreSQL table, see the first example query after these steps.

    If a primary key isn’t listed in the dropdown, you must create one using a formula field.

    1. To create a formula field, click New Formula Field.

    2. For Field Label, enter the data stream field’s display name.

    3. For Field API Name, enter the data stream field’s programmatic reference.

    4. For Formula Return Type, select Text.

    5. In the Transformation Formula text box, enter the UUID() formula, which generates a universally unique identifier for each record.

    6. To validate the formula, click Test.

    7. Click Save.

    8. For Primary Key, select the UUID formula field that you created.

  7. (Optional) Select a record modified field.

    If data is received out of order, the record modified field provides a reference point to determine whether to update the record. The record with the most recent timestamp is loaded. To find candidate timestamp columns in your PostgreSQL source, see the second example query after these steps.

  8. (Optional) For Organization Unit Identifier, select a business unit to use in a record’s data lineage.

  9. Click Next.

  10. For Data Space, if the default data space isn’t selected, assign the data stream to the appropriate data space.

  11. Click Deploy.
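
If you're not sure which field to select as the primary key in step 6, you can check the source table directly in PostgreSQL. This query is an illustrative sketch: it lists a table's primary key column(s) from the system catalogs, and public.orders is a placeholder for your own schema-qualified table name.

  -- List the primary key column(s) of a source table.
  -- 'public.orders' is a placeholder; substitute your own table.
  SELECT a.attname AS column_name
  FROM pg_index i
  JOIN pg_attribute a
    ON a.attrelid = i.indrelid
   AND a.attnum = ANY (i.indkey)
  WHERE i.indrelid = 'public.orders'::regclass
    AND i.indisprimary;

If this query returns no rows, the table has no primary key constraint, and creating a UUID() formula field (step 6) is the safer choice.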
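Similarly, for the record modified field in step 7, you can list the source table's timestamp columns to find a suitable last-modified column. Again, public and orders are placeholders for your own schema and table.

  -- Find timestamp columns that can serve as the record modified field.
  SELECT column_name, data_type
  FROM information_schema.columns
  WHERE table_schema = 'public'
    AND table_name = 'orders'
    AND data_type IN ('timestamp without time zone', 'timestamp with time zone');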

When the Last Run Status shows Success, you can see how many records were processed and the total number of records loaded.

You can now map your newly created DLO to the semantic data model to use the data in segments, calculated insights, and other use cases.