Replicate
Configure Replicate specification.
When using Replicate LLMs, you have the choice of using a built-in Graphlit model, which accrues credits for usage, or using your own Replicate API key.
Graphlit currently supports these built-in Replicate models via the model field: MISTRAL_7B_INSTRUCT, MISTRAL_7B, LLAMA_2_7B_CHAT, LLAMA_2_7B, LLAMA_2_13B_CHAT, LLAMA_2_13B, LLAMA_2_70B_CHAT, and LLAMA_2_70B.
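Creating a specification for one of the built-in models might look like the following sketch. The exact input shape is an assumption based on Graphlit's other model specifications; the serviceType and replicate field names are illustrative, not confirmed by this page:

```graphql
mutation CreateSpecification {
  createSpecification(
    specification: {
      name: "Replicate Mistral 7B Instruct"
      type: COMPLETION
      serviceType: REPLICATE
      # Built-in model: usage accrues Graphlit credits, no API key needed
      replicate: { model: MISTRAL_7B_INSTRUCT }
    }
  ) {
    id
    name
  }
}
```

The returned id can then be referenced when prompting the specification.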
Currently, Replicate models can only be used with the promptSpecifications mutation; they are not supported for conversations with promptConversation. The Llama 2 and Mistral models are not yet reliable at returning JSON, which the current Graphlit RAG implementation requires.
You can find the list of current Replicate models here, and use the versioned model name syntax, such as meta/llama-2-7b-chat:13c3cdee13ee059ab779f0291d29054dab00a47dad8261375654de5540165fb0.
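Prompting a Replicate-backed specification directly could then look like this sketch; the argument names (prompt, ids) and response fields are assumptions about the promptSpecifications mutation, and SPECIFICATION_ID is a placeholder for a specification created earlier:

```graphql
mutation PromptSpecifications {
  promptSpecifications(
    prompt: "Summarize the benefits of retrieval-augmented generation."
    ids: ["SPECIFICATION_ID"]
  ) {
    # One completion is returned per specification ID
    specification {
      id
    }
    messages {
      message
    }
  }
}
```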
By assigning the model field to CUSTOM, you will also need to assign the key and modelName fields to use your own Replicate developer account.
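A specification using your own Replicate account might look like the following sketch; the field placement is illustrative, the key value is a placeholder for your Replicate API token, and the modelName uses the versioned syntax shown above:

```graphql
mutation CreateSpecification {
  createSpecification(
    specification: {
      name: "Replicate Llama 2 7B Chat (custom key)"
      type: COMPLETION
      serviceType: REPLICATE
      replicate: {
        # CUSTOM requires your own Replicate API key and a versioned model name
        model: CUSTOM
        key: "YOUR_REPLICATE_API_KEY"
        modelName: "meta/llama-2-7b-chat:13c3cdee13ee059ab779f0291d29054dab00a47dad8261375654de5540165fb0"
      }
    }
  ) {
    id
  }
}
```

With CUSTOM, usage is billed to your Replicate account rather than accruing Graphlit credits.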