Step 4: Build and Test a Prompt in Prompt Builder

Now that Data 360 is ingesting and indexing your website content and you've connected a retriever, create a prompt that tells the LLM how to answer questions using the retrieved content. The prompt defines how the model behaves, which sources it uses, how it responds when information is missing, and how the final answer is formatted.

Include these two parts in your prompt:

  1. Instructions and guidance for the LLM.
  2. Two parameters to allow the model to generate answers based on your indexed website content and not on general model knowledge.
  • User’s Question: A free-text input that receives the user’s question from the agent.
  • Knowledge Base Information: The retriever output: the relevant chunks and metadata that the retriever returns from the search index and passes into the prompt.

As you build your prompt, consider the following.

  • Use the following example as a starting point for your prompt.
  • Replace the example retriever resource name with the retriever you created for your search index. Adapt it to your own site, agent type, intended audience, and expected agent response.
  • Be explicit about what sources the model is allowed to use.
  • Clearly define what should happen when the retriever doesn't return a good answer.
  • Specify whether the answer should include citations or source URLs.
  • Tailor the formatting to the expected end-user experience.
  • Keep the instructions focused on the actual role of the agent.

To adapt the following prompt example to your own use case, you can ask an LLM of your choice to rewrite it for your specific site and other considerations. Here are some example modifications.

  • Replace Cirrus-specific language with your product or company terminology.
  • Adjust the tone from technical support to customer service, internal help desk, or product guidance.
  • Change the fallback behavior when no answer is found.
  • Change the response format.
  • Add or remove guardrails depending on what the agent should or should not answer.

Cirrus designed this product support prompt so that the model answers using only the indexed Cirrus content.
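A sketch of what such a prompt body might look like. The wording is illustrative, and the {{userQuery}} and {{retrieverResults}} placeholders are stand-ins: in Prompt Builder you add these through Insert Resource rather than typing merge fields by hand.

```text
You are a product support assistant for Cirrus. Answer the user's question
using only the information in the Knowledge Base Information section below.

Rules:
- Base every statement on the retrieved content, not on general knowledge.
- If the retrieved content doesn't answer the question, say you don't have
  enough information and suggest contacting Cirrus support.
- Cite the source URL of each passage you use.
- Keep answers concise, as short paragraphs or bullet points.

User's Question:
{{userQuery}}

Knowledge Base Information:
{{retrieverResults}}
```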

The template defines how the agent uses the retrieved data to answer a user. After the prompt is configured, it follows this workflow.

  • The user’s question is passed into the prompt through the User Query input.
  • The same question is used as the retriever’s search text.
  • The retriever returns the most relevant results from the indexed website content.
  • These results are passed into the prompt as Knowledge Base Information.
  • The LLM uses both the question and the retrieved content to generate a grounded answer. It keeps the response tied to your indexed website content instead of relying on general model knowledge.
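The workflow above can be sketched in code. This is an illustrative stand-in, not a Data 360 or Prompt Builder API: the index, the retriever, and the prompt assembly are simplified stubs.

```python
# A tiny stand-in for the search index: (url, title, chunk) records.
SEARCH_INDEX = [
    ("https://example.com/billing", "Billing FAQ",
     "Invoices are issued on the first business day of each month."),
    ("https://example.com/setup", "Setup Guide",
     "Connect your account before importing website content."),
]

def retrieve(query, limit=10):
    """Return the most relevant indexed chunks for the query (naive keyword match)."""
    hits = [rec for rec in SEARCH_INDEX
            if any(word in rec[2].lower() for word in query.lower().split())]
    return hits[:limit]

def build_prompt(question):
    """Assemble the grounded prompt: instructions, the question, and retrieved chunks."""
    chunks = retrieve(question)  # the same question is used as the search text
    kb = "\n".join(f"- {title} ({url}): {text}" for url, title, text in chunks)
    return (
        "Answer using only the Knowledge Base Information below.\n"
        f"User's Question: {question}\n"
        f"Knowledge Base Information:\n{kb or '(no relevant content found)'}"
    )

prompt = build_prompt("When are invoices issued?")
print(prompt)
```

The LLM then receives this assembled prompt, so every answer is tied to the retrieved chunks, and the fallback text appears when nothing relevant is found.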

To build the prompt:

  1. From the Data 360 app, use the quick search bar to find and select Agentforce Studio.
  2. From the menu on the left, select Prompt Template.
  3. Click New Prompt Template.
  4. Select the Flex template type.
  5. Name the template and define the Inputs:
    • Create an input named User Query.
    • Set Type to Free Text.
    • Click Next.
  6. In the Prompt Editor, click Insert Resource to add your prompt.
  7. Paste your main prompt instructions into the prompt body.
  8. Insert the Query and Retriever parameters at the bottom of the prompt.
    • To add a User Query parameter, click Insert Resource at the top, and select User Query.
    • To add a Knowledge Base Information parameter, click Insert Resource at the top, select Retrievers, and then select Configure Retrievers.
  9. Configure the Retriever within the prompt:
    • On the Resources tab on the left, select your retriever from the list.
    • Scroll down to Search Parameters.
    • Under Search Text, search for and select User Query.
    • Select the output fields: URL, Title, Chunks.
    • For Number of Results, select 10.
    • Click Apply and Insert, and then Save and Preview.
  10. To test your prompt, click the Preview Settings icon in the left menu, and enter a sample user question under User Query.
  11. Verify the following results.
    • The retriever returns relevant results.
    • The answer is grounded in the indexed content.
    • The response format follows your instructions.
    • Citations or source URLs appear as expected.
    • Fallback behavior works when no relevant content is found.
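When you test many sample questions, these checks can be scripted. A minimal sketch, assuming you can capture each generated answer as plain text; the helper name, the citation heuristic, and the fallback phrase are assumptions to adapt to your own prompt.

```python
# Illustrative checks for generated answers. The rules mirror the
# checklist above; the fallback phrase is an assumption, not a fixed string.
FALLBACK_PHRASE = "don't have enough information"

def check_answer(answer, expect_fallback=False):
    """Return a list of failed checks for one generated answer."""
    failures = []
    if expect_fallback:
        # A question with no indexed answer should trigger the fallback text.
        if FALLBACK_PHRASE not in answer.lower():
            failures.append("fallback behavior missing")
    else:
        # A grounded answer should cite at least one source URL.
        if "http" not in answer:
            failures.append("no citation or source URL")
    return failures

# A grounded answer with a citation passes.
good = "Invoices are issued monthly. Source: https://example.com/billing"
assert check_answer(good) == []

# An unanswerable question should produce the fallback text.
fallback = "I don't have enough information to answer that."
assert check_answer(fallback, expect_fallback=True) == []
```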

If the results are weak, refine the prompt wording, retriever configuration, or indexing approach before you activate your agent.

To build an agent from your prompt template, follow the documentation in Agentforce and Einstein Generative AI.