
Specifications

Create, manage and query LLM specifications.

Overview

When creating a conversation about content, you are utilizing a Large Language Model (LLM), such as OpenAI GPT-3.5. Each LLM provides a number of configuration settings, which allow you to tune the completion of the user prompt in various ways.
LLM specifications are a way to select a specific LLM and save its configuration, so it can be reused across multiple conversations.
Graphlit supports a number of LLMs, including models hosted by Azure OpenAI, OpenAI, and Anthropic.
In addition to model-specific parameters, specifications store the model's system prompt, which provides instructions to the LLM for how to complete the user prompt. This allows you to create multiple specifications as templates for various personas, such as a call-center agent or academic researcher.
Writing a system prompt that yields the expected LLM output can take some trial and error, and OpenAI has published best practices for prompt engineering.
Specifications also provide several advanced configuration properties, such as conversation strategy and the ability to tune the semantic search of content sources that are formatted into the LLM prompt.
Each model can be configured further in your specification, including with your own model deployment or API key.
Create Specification
The createSpecification mutation creates a specification from a name, serviceType, and systemPrompt, and returns the essential details of the newly created specification: its ID, name, state, type, and service type.
The LLM systemPrompt is a text input provided to the model to instruct it on what kind of content or responses to generate. It serves as the initial message or query that sets the context and guides the model's behavior, helping it generate text that is relevant, informative, or creative, depending on the user's needs.
When using the OPEN_AI, AZURE_OPEN_AI, or ANTHROPIC service type, you can assign the corresponding openAI, azureOpenAI, or anthropic parameters: the model, model temperature, model probability, and the completionTokenLimit.
Anthropic, OpenAI, and Azure OpenAI models accept a sampling temperature between 0 and 2. Higher values, such as 0.8, make the output more random, while lower values, such as 0.2, make it more focused and deterministic.
Model probability is an alternative to sampling with temperature, where the model considers only the tokens comprising the top probability mass. The default value is 1; a value of 0.1 means only the tokens comprising the top 10% of probability mass are considered.
Completion token limit is the maximum number of tokens which the LLM will return in the prompt completion.
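As a minimal sketch of how these sampling parameters fit together, the helper below builds the openAI portion of a SpecificationInput and enforces the ranges described above. The function name and the "probability" field name are illustrative assumptions, not part of the Graphlit API.

```python
# Hypothetical helper illustrating the sampling parameters described above.
# The ranges mirror the documentation: temperature in [0, 2], probability in (0, 1].
def make_openai_config(model, temperature=0.0, probability=None,
                       completion_token_limit=256):
    """Build the `openAI` portion of a SpecificationInput (field names assumed)."""
    if not 0.0 <= temperature <= 2.0:
        raise ValueError("temperature must be between 0 and 2")
    config = {
        "model": model,
        "temperature": temperature,
        "completionTokenLimit": completion_token_limit,
    }
    if probability is not None:
        # Probability (top-p) is an alternative to temperature sampling.
        if not 0.0 < probability <= 1.0:
            raise ValueError("probability must be in (0, 1]")
        config["probability"] = probability
    return config

config = make_openai_config("GPT35_TURBO", temperature=0.2)
```

In practice you would set either temperature or probability, not both, since they are alternative sampling strategies.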
Mutation:
mutation CreateSpecification($specification: SpecificationInput!) {
  createSpecification(specification: $specification) {
    id
    name
    state
    type
    serviceType
  }
}
Variables:
{
  "specification": {
    "serviceType": "OPEN_AI",
    "systemPrompt": "You are a Machine Learning researcher, who is intelligent, experienced and detail oriented. Use the provided content sources to answer the request the user has sent. Please cite the sources and relevant pages numbers with your answer, as if you were writing technical documentation. Combine any sources within the same document.",
    "openAI": {
      "model": "GPT35_TURBO",
      "temperature": 0.0,
      "completionTokenLimit": 256
    },
    "name": "Machine Learning Researcher"
  }
}
Response:
{
  "type": "COMPLETION",
  "serviceType": "OPEN_AI",
  "id": "bf20d121-8332-405f-bfe2-7789b9e19215",
  "name": "Machine Learning Researcher",
  "state": "ENABLED"
}
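The mutation above can be sent as a standard GraphQL-over-HTTP request. The sketch below builds the JSON body and shows a plain-stdlib POST; the endpoint URL placeholder and the bearer-token auth scheme are assumptions, so substitute your actual Graphlit endpoint and credentials.

```python
import json
import urllib.request

CREATE_SPECIFICATION = """
mutation CreateSpecification($specification: SpecificationInput!) {
  createSpecification(specification: $specification) {
    id
    name
    state
    type
    serviceType
  }
}
"""

def build_graphql_request(query, variables):
    """Return the JSON body a GraphQL endpoint expects."""
    return {"query": query, "variables": variables}

def post_graphql(endpoint, token, body):
    """POST the request; endpoint and auth header are assumptions."""
    req = urllib.request.Request(
        endpoint,
        data=json.dumps(body).encode("utf-8"),
        headers={"Content-Type": "application/json",
                 "Authorization": f"Bearer {token}"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

spec = {
    "serviceType": "OPEN_AI",
    "name": "Machine Learning Researcher",
    "systemPrompt": "You are a Machine Learning researcher...",
    "openAI": {"model": "GPT35_TURBO", "temperature": 0.0,
               "completionTokenLimit": 256},
}
body = build_graphql_request(CREATE_SPECIFICATION, {"specification": spec})
# response = post_graphql("https://<your-graphlit-endpoint>/graphql", token, body)
```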
Update Specification
The updateSpecification mutation enables the renaming of a specification by accepting the specification id and its new name, and it returns the ID, name, state, and type of the updated specification.
Mutation:
mutation UpdateSpecification($specification: SpecificationUpdateInput!) {
  updateSpecification(specification: $specification) {
    id
    name
    state
    type
  }
}
Variables:
{
  "specification": {
    "id": "bf20d121-8332-405f-bfe2-7789b9e19215",
    "name": "ML Researcher"
  }
}
Response:
{
  "type": "SPECIFICATION",
  "id": "bf20d121-8332-405f-bfe2-7789b9e19215",
  "name": "ML Researcher",
  "state": "ENABLED"
}
Delete Specification
The deleteSpecification mutation allows the deletion of a specification by utilizing the id parameter, and it returns the ID and state of the deleted specification.
Mutation:
mutation DeleteSpecification($id: ID!) {
  deleteSpecification(id: $id) {
    id
    state
  }
}
Variables:
{
  "id": "bf20d121-8332-405f-bfe2-7789b9e19215"
}
Response:
{
  "id": "bf20d121-8332-405f-bfe2-7789b9e19215",
  "state": "DELETED"
}
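Since the mutation returns the deleted specification's state, a client can confirm the deletion succeeded before moving on. In this sketch, post_graphql stands in for whatever GraphQL client you use (it is an assumed callable, not part of the Graphlit API).

```python
# Hypothetical sketch: delete a specification and confirm the returned state.
DELETE_SPECIFICATION = """
mutation DeleteSpecification($id: ID!) {
  deleteSpecification(id: $id) {
    id
    state
  }
}
"""

def delete_specification(post_graphql, spec_id):
    """Delete a specification and verify the response reports DELETED."""
    data = post_graphql(DELETE_SPECIFICATION, {"id": spec_id})
    result = data["deleteSpecification"]
    if result["state"] != "DELETED":
        raise RuntimeError(f"unexpected state: {result['state']}")
    return result["id"]
```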
Get Specification
The specification query allows you to retrieve specific details of a specification by providing the id parameter, including the ID, name, creation date, state, owner ID, and type associated with the specification. You can also retrieve the openAI details for the LLM specification.
Query:
query GetSpecification($id: ID!) {
  specification(id: $id) {
    id
    name
    creationDate
    state
    owner {
      id
    }
    type
    serviceType
    systemPrompt
    messageLimit
    openAI {
      tokenLimit
      overlapTokens
      completionTokenLimit
      modelName
      temperature
    }
  }
}
Variables:
{
  "id": "bf20d121-8332-405f-bfe2-7789b9e19215"
}
Response:
{
  "type": "COMPLETION",
  "serviceType": "OPEN_AI",
  "systemPrompt": "You are a Machine Learning researcher, who is intelligent, experienced and detail oriented. Use the provided content sources to answer the request the user has sent. Please cite the sources and relevant pages numbers with your answer, as if you were writing technical documentation. Combine any sources within the same document.",
  "openAI": {
    "modelName": "gpt-3.5-turbo",
    "temperature": 0.0,
    "completionTokenLimit": 256
  },
  "id": "bf20d121-8332-405f-bfe2-7789b9e19215",
  "name": "Machine Learning Researcher",
  "state": "ENABLED",
  "creationDate": "2023-07-04T01:12:31Z",
  "owner": {
    "id": "9422b73d-f8d6-4faf-b7a9-152250c862a4"
  }
}
Query Specifications
The specifications query allows you to retrieve all specifications. It returns a list of specification results, including the ID, name, creation date, state, owner ID, and type for each specification. It also allows you to retrieve the specific model parameters.
Query Specifications
Query:
query QuerySpecifications($filter: SpecificationFilter!) {
  specifications(filter: $filter) {
    results {
      id
      name
      creationDate
      state
      owner {
        id
      }
      type
      serviceType
      systemPrompt
      messageLimit
      openAI {
        tokenLimit
        overlapTokens
        completionTokenLimit
        modelName
        temperature
      }
    }
  }
}
Variables:
{
  "filter": {
    "offset": 0,
    "limit": 100
  }
}
Response:
{
  "results": [
    {
      "type": "COMPLETION",
      "serviceType": "OPEN_AI",
      "systemPrompt": "You are a Machine Learning researcher, who is intelligent, experienced and detail oriented. Use the provided content sources to answer the request the user has sent. Please cite the sources and relevant pages numbers with your answer, as if you were writing technical documentation. Combine any sources within the same document.",
      "openAI": {
        "modelName": "gpt-3.5-turbo",
        "temperature": 0.0,
        "completionTokenLimit": 256
      },
      "id": "bf20d121-8332-405f-bfe2-7789b9e19215",
      "name": "Machine Learning Researcher",
      "state": "ENABLED",
      "creationDate": "2023-07-04T01:12:31Z",
      "owner": {
        "id": "9422b73d-f8d6-4faf-b7a9-152250c862a4"
      }
    }
  ]
}
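Because the filter accepts offset and limit, a client can page through all specifications in batches. The loop below is a sketch under the assumption that a short (or empty) page signals the end of the results; run_query stands in for whatever GraphQL client you use.

```python
# Hypothetical pagination sketch using the offset/limit filter shown above.
def fetch_all_specifications(run_query, page_size=100):
    """Page through specification results until a short page is returned."""
    results, offset = [], 0
    while True:
        page = run_query({"filter": {"offset": offset, "limit": page_size}})
        batch = page["results"]
        results.extend(batch)
        if len(batch) < page_size:
            # A page smaller than page_size means there are no more results.
            return results
        offset += page_size
```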
Query Specifications By Name
The specifications query also allows you to retrieve specifications matching specific filter criteria, via the name parameter. In this example, the name is set to "Researcher". It returns a list of specification results containing the ID, name, creation date, state, owner ID, and type for each matching specification.
Query:
query QuerySpecifications($filter: SpecificationFilter!) {
  specifications(filter: $filter) {
    results {
      id
      name
      creationDate
      state
      owner {
        id
      }
      type
      serviceType
      systemPrompt
      messageLimit
      openAI {
        tokenLimit
        overlapTokens
        completionTokenLimit
        modelName
        temperature
      }
    }
  }
}
Variables:
{
  "filter": {
    "name": "Researcher",
    "offset": 0,
    "limit": 100
  }
}
Response:
{
  "results": [
    {
      "type": "COMPLETION",
      "serviceType": "OPEN_AI",
      "systemPrompt": "You are a Machine Learning researcher, who is intelligent, experienced and detail oriented. Use the provided content sources to answer the request the user has sent. Please cite the sources and relevant pages numbers with your answer, as if you were writing technical documentation. Combine any sources within the same document.",
      "openAI": {
        "modelName": "gpt-3.5-turbo",
        "temperature": 0.0,
        "completionTokenLimit": 256
      },
      "id": "bf20d121-8332-405f-bfe2-7789b9e19215",
      "name": "Machine Learning Researcher",
      "state": "ENABLED",
      "creationDate": "2023-07-04T01:12:31Z",
      "owner": {
        "id": "9422b73d-f8d6-4faf-b7a9-152250c862a4"
      }
    }
  ]
}