Get Started with Einstein Studio

With Einstein Studio, you can configure foundation models and test prompts in a playground environment before deploying a model to production. See Configure and Test a Model with Model Playground.

You can also bring your own large language model (BYOLLM) from an LLM provider using your own account. A model that you bring is saved to the Model Library in Einstein Studio as a foundation model that you can configure. See Add a Foundation Model.

After you’re familiar with configuring and managing models in Einstein Studio, you can access those models by using the Models API.

Support for configured models and BYOLLM is coming soon to the Models API. For now, you can get started with a set of standard, preconfigured models. See Supported Models for Models API.

Access your Einstein Studio generative models with the Models API.
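As a rough sketch of what a Models API call can look like, the example below sends a text-generation request to one of the standard models from Python with the requests library. The endpoint path, request headers, model name, and response shape shown here are illustrative assumptions, not a definitive reference; confirm the exact values in the Models API documentation and in Supported Models for Models API, and replace the placeholder access token with one issued to your own connected app.

```python
# Minimal sketch: call a standard Models API model from Python with requests.
# The endpoint, headers, model name, and response fields below are assumptions
# for illustration; verify them against the Models API reference.
import requests

ACCESS_TOKEN = "<OAuth access token from your connected app>"  # assumption: OAuth bearer token
MODEL = "sfdc_ai__DefaultOpenAIGPT4OmniMini"  # example model API name (assumption)

url = f"https://api.salesforce.com/einstein/platform/v1/models/{MODEL}/generations"
headers = {
    "Authorization": f"Bearer {ACCESS_TOKEN}",
    "Content-Type": "application/json;charset=utf-8",
    # Context headers the Models API typically expects (assumption):
    "x-sfdc-app-context": "EinsteinGPT",
    "x-client-feature-id": "ai-platform-models-connected-app",
}
payload = {"prompt": "Summarize the benefits of Einstein Studio in one sentence."}

# Send the generation request and fail loudly on HTTP errors.
response = requests.post(url, headers=headers, json=payload, timeout=30)
response.raise_for_status()

# The generated text is typically nested under a "generation" object (assumption).
print(response.json().get("generation", {}).get("generatedText"))
```

In practice, you obtain the access token through a connected app that's enabled for the Models API, and you choose the model name from the list in Supported Models for Models API.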