Use Custom Scripts in Data 360 (Beta)
Create custom Python scripts for Data 360 batch transforms by using the Data Custom Code SDK. Write and test your scripts locally, then deploy them to your sandbox. After testing in the sandbox, use a data kit to move your batch data transforms and custom code to production for processing live data.
Code extension is a pilot or beta service that is subject to the Beta Services Terms at Agreements - Salesforce.com or a written Unified Beta Agreement if executed by Customer, and applicable terms in the Product Terms Directory. Use of this pilot or beta service is at the Customer's sole discretion.
| Edition Table |
|---|
| Available in: Developer, Enterprise, Performance, and Unlimited Editions. See Data 360 edition availability. |
| Permission Sets Needed | |
|---|---|
| To use custom scripts in Data 360: | Permission set: |
- Make sure that you have access to a Data 360 org with appropriate permission sets.
- Make sure that you have access to a Data 360 sandbox with appropriate permission sets. If you don’t have a sandbox, create one. See Create a Data 360 Sandbox.
- Set up an external client app to enable the SDK to authenticate with your Data 360 sandbox. See Set Up an External Client App for Data Custom Code SDK.
- Confirm that your Data 360 sandbox contains the data model objects (DMOs) and the data lake objects (DLOs) that your custom scripts reference.
Your script must read from and write to the same object type: DLOs to DLOs and DMOs to DMOs. Custom scripts read from your source data objects and write transformed results to different data objects, preserving your original data.
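The same-object-type rule can be illustrated with a small validation helper. This is a sketch only, not part of the SDK: the function names are hypothetical, and the suffix convention assumes the common Data 360 API-name pattern (`__dll` for DLOs, `__dlm` for DMOs).

```python
# Illustrative only: check that a transform's source and target are the
# same object type (DLO to DLO, or DMO to DMO) and are different objects.
# Suffix convention ("__dll" for DLOs, "__dlm" for DMOs) is an assumption
# based on common Data 360 naming; adjust for your org's actual objects.

def object_type(name: str) -> str:
    """Classify a Data 360 object by its API-name suffix."""
    if name.endswith("__dll"):
        return "DLO"
    if name.endswith("__dlm"):
        return "DMO"
    raise ValueError(f"Unrecognized object name: {name}")

def validate_transform(source: str, target: str) -> None:
    """Reject mixed-type transforms and same-object writes before deployment."""
    if source == target:
        raise ValueError("Write to a different object to preserve the source data.")
    if object_type(source) != object_type(target):
        raise ValueError(
            f"Source ({object_type(source)}) and target ({object_type(target)}) "
            "must be the same object type."
        )

validate_transform("Orders__dll", "OrdersEnriched__dll")  # OK: DLO to DLO
```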
You must manually assign and audit appropriate governance tags on target DLOs or DMOs that your code extension scripts create or update. This ensures compliance with your organization's data governance policies. Automatic tag propagation for target objects will be supported soon.
Set up the SDK to develop and test custom Python scripts locally, and then deploy them to Data 360 for execution. See Set Up Data Custom Code SDK.
Write and test your custom transform logic by using your preferred local development environment. This iterative process helps you catch issues early and refine your logic for production readiness. See Write and Validate Custom Scripts.
All code execution logs are written to a dedicated Logs DLO (DataCustomCodeLogs__dll) for observability and troubleshooting. Data governance policies don't apply to the Logs DLO, and any user with access to the Logs DLO can view its contents. To prevent inadvertent exposure of sensitive information, ensure that your code extension scripts output only the information that is appropriate for users with access to the Logs DLO to see. Don't output PII, credentials, or other sensitive data to standard output (for example, Python print() statements or console logging).
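One practical pattern is to scrub known sensitive fields before anything reaches standard output. This is a sketch, not an SDK feature; the field names in the deny list are examples that you'd extend for your own data.

```python
# Illustrative log-hygiene helper: because any user with access to the
# Logs DLO can read whatever the script prints, redact sensitive fields
# before logging. The field names below are examples only.

SENSITIVE_FIELDS = {"email", "ssn", "password", "token", "phone"}

def redact(record: dict) -> dict:
    """Return a copy of the record that is safe to write to standard output."""
    return {
        key: "[REDACTED]" if key.lower() in SENSITIVE_FIELDS else value
        for key, value in record.items()
    }

row = {"id": "42", "email": "ada@example.com", "status": "processed"}
print(redact(row))  # {'id': '42', 'email': '[REDACTED]', 'status': 'processed'}
```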
The deployment process packages your Python script, its dependencies, and Data 360 configuration, and then uploads them to Data 360. See Deploy a Custom Script to Data 360 Sandbox Using UI.
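Conceptually, the deployment package is an archive that bundles your script, a dependency manifest, and configuration. The SDK's tooling does this for you; the sketch below only illustrates the idea, and every file name and field in it is an example, not the SDK's actual package layout.

```python
# Illustrative only: the SDK's deployment tooling performs packaging for
# you. This sketch just shows the idea of bundling a script, a dependency
# manifest, and a config file into one archive. File names are examples.

import io
import zipfile

def build_package(files: dict) -> bytes:
    """Bundle a {filename: contents} mapping into an in-memory zip archive."""
    buffer = io.BytesIO()
    with zipfile.ZipFile(buffer, "w", zipfile.ZIP_DEFLATED) as archive:
        for name, contents in files.items():
            archive.writestr(name, contents)
    return buffer.getvalue()

package = build_package({
    "entrypoint.py": "def process(df):\n    return df\n",
    "requirements.txt": "pandas\n",
    "config.json": '{"description": "example"}',
})
with zipfile.ZipFile(io.BytesIO(package)) as archive:
    print(archive.namelist())  # ['entrypoint.py', 'requirements.txt', 'config.json']
```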
Monitor your deployment to verify that it succeeded and that your custom script is ready for use. See Monitor Deployment Details.
Invoke your custom code by creating a batch data transform that uses your deployed custom script. See Invoke Custom Code by Creating a Batch Data Transform.
Run your custom code by running the batch data transform that uses your deployed custom script. See Run Custom Code by Running a Batch Data Transform.
Monitor your custom script’s execution, review logs, and validate the output to verify that your script transforms the data as expected. If you find issues, return to Step 4 to modify and redeploy your script. See Run History.
Create a DevOps data kit in your sandbox, add your validated batch data transform, and include any referenced DLOs or DMOs that don't exist in production. When you add a batch data transform, the custom code it references is automatically included. Alternatively, you can add custom code directly to the data kit and create batch data transforms in production later to associate with the custom code. Review the publishing sequence, publish the data kit, add it to a package, and then deploy to production. See Migrate Custom Script Data to Production.
If you migrated a batch data transform, run your custom code in production by running the batch data transform, and monitor execution to verify that your custom transform works correctly. If you migrated only custom code without a batch transform, first invoke the migrated custom code by creating a batch data transform that uses it, and then run the batch data transform. See Invoke Custom Code by Creating a Batch Data Transform, Run Custom Code by Running a Batch Data Transform, and Run History.