Specifications
Create, manage and query LLM specifications.
When creating a conversation about content, you are using a Large Language Model (LLM), such as OpenAI GPT-4o. Each LLM provides a number of configuration settings that allow you to tune how the user prompt is completed.
LLM specifications are a way to select a specific LLM and save its configuration, so it can be reused across multiple conversations.
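As a minimal sketch of what this looks like in practice, the following Python snippet creates a specification by calling the Graphlit GraphQL API directly. The endpoint URL, mutation shape, field names, and enum values shown here are assumptions to verify against the Graphlit API reference.

```python
# Minimal sketch: create a reusable LLM specification via the Graphlit
# GraphQL API. Endpoint, mutation shape, and enum values are assumptions;
# check the Graphlit API reference for the exact schema.
import os
import requests

GRAPHLIT_URL = "https://data-scus.graphlit.io/api/v1/graphql"  # assumed endpoint

CREATE_SPECIFICATION = """
mutation CreateSpecification($specification: SpecificationInput!) {
  createSpecification(specification: $specification) {
    id
    name
  }
}
"""

def create_specification(token: str, specification: dict) -> dict:
    """POST the createSpecification mutation and return the created entity."""
    response = requests.post(
        GRAPHLIT_URL,
        headers={"Authorization": f"Bearer {token}"},
        json={
            "query": CREATE_SPECIFICATION,
            "variables": {"specification": specification},
        },
        timeout=30,
    )
    response.raise_for_status()
    return response.json()["data"]["createSpecification"]

if __name__ == "__main__":
    spec = create_specification(
        os.environ["GRAPHLIT_TOKEN"],  # assumed: a project-scoped JWT
        {
            "name": "GPT-4o, low temperature",
            "type": "COMPLETION",       # assumed enum value
            "serviceType": "OPEN_AI",   # assumed enum value
            "openAI": {
                "model": "GPT4O_128K",  # assumed enum value
                "temperature": 0.1,
            },
        },
    )
    print(f"Created specification {spec['id']} ({spec['name']})")
```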
Graphlit supports LLMs from a wide variety of hosted services:
Azure-hosted OpenAI
OpenAI
Anthropic
Mistral
Groq
DeepSeek
Replicate
For more information on configuring these models in your specification, or on using your own model deployment or API key, see the provider-specific pages:
Azure OpenAI
OpenAI
Anthropic
Mistral
Groq
DeepSeek
Replicate
In addition to model-specific parameters, specifications store the model's system prompt, which provides instructions to the LLM for how to complete the user prompt. This allows you to create multiple specifications as templates for various personas, such as a call-center agent or academic researcher.
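For example, two persona templates might differ only in their system prompt. A sketch, again assuming the SpecificationInput field names and enum values shown:

```python
# Two persona templates that differ only in their system prompt. The
# SpecificationInput field names and enum values are assumptions; each
# dict would be passed to the createSpecification mutation shown earlier.
PERSONAS = {
    "Call-center agent": (
        "You are a patient call-center agent. Answer only from the "
        "provided content, keep answers under three sentences, and ask "
        "a clarifying question when the request is ambiguous."
    ),
    "Academic researcher": (
        "You are an academic researcher. Summarize the provided content "
        "formally, quote the passages you rely on, and note any gaps or "
        "contradictions in the material."
    ),
}

def persona_specification(persona: str, system_prompt: str) -> dict:
    """Build a SpecificationInput-style dict for one persona template."""
    return {
        "name": f"GPT-4o: {persona}",
        "type": "COMPLETION",           # assumed enum value
        "serviceType": "OPEN_AI",       # assumed enum value
        "systemPrompt": system_prompt,  # assumed field name
        "openAI": {"model": "GPT4O_128K", "temperature": 0.2},
    }

specifications = [persona_specification(p, s) for p, s in PERSONAS.items()]
```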
Specifications also provide several advanced configuration properties, such as the conversation strategy and the ability to tune the semantic search over the content sources that are formatted into the LLM prompt. More details can be found here.
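Once saved, a specification is reused by referencing its id when creating each conversation. A sketch, assuming a createConversation mutation of the shape below:

```python
# Sketch: reuse a saved specification across conversations by id. The
# createConversation mutation shape and field names are assumptions to
# verify against the Graphlit API reference.
import requests

GRAPHLIT_URL = "https://data-scus.graphlit.io/api/v1/graphql"  # assumed endpoint

CREATE_CONVERSATION = """
mutation CreateConversation($conversation: ConversationInput!) {
  createConversation(conversation: $conversation) {
    id
  }
}
"""

def create_conversation(token: str, name: str, specification_id: str) -> str:
    """Create a conversation bound to an existing specification."""
    response = requests.post(
        GRAPHLIT_URL,
        headers={"Authorization": f"Bearer {token}"},
        json={
            "query": CREATE_CONVERSATION,
            "variables": {
                "conversation": {
                    "name": name,
                    "specification": {"id": specification_id},  # assumed shape
                }
            },
        },
        timeout=30,
    )
    response.raise_for_status()
    return response.json()["data"]["createConversation"]["id"]

# Example: every support conversation reuses the same saved specification.
# conversation_id = create_conversation(token, "Support chat", spec["id"])
```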
Writing your system prompt to get the expected LLM output can take some trial and error, and OpenAI has published some best practices here and here.