LLM Pre Prompts

When you implement an AI feature in your app using large language models, you need a well-crafted prompt to get the best results. Most developers embed prompts directly in their code, which makes them hard to maintain and update. To solve this, FrontLLM introduces the concept of Pre Prompts.

The Pre Prompts feature allows you to create and manage prompts directly from the FrontLLM admin panel. You can then reference the predefined prompts in your code by name, and FrontLLM inserts the matching Pre Prompt into the request before it is sent to the model.

To create and edit Pre Prompts, go to the FrontLLM admin panel and edit your gateway. In the "Pre Prompts" section, you can add a new Pre Prompt by providing a name and the prompt content.

How to Use Pre Prompts

Your code:

const gateway = frontLLM('<gateway_id>');

const response = await gateway.complete({
  model: 'fast',
  messages: [{ role: 'user', content: 'Hello world!' }],
  pre_prompt_name: 'example_pre_prompt'
});

FrontLLM modifies this request to include the content of the Pre Prompt named example_pre_prompt before sending it to the LLM. The final JSON request sent to the model will look like this:

{
  "model": "fast",
  "messages": [
    { "role": "system", "content": "This is an example pre prompt." },
    { "role": "user", "content": "Hello world!" }
  ]
}