Create and Prompt
User Intent
"I want to create a conversation and ask questions with RAG (Retrieval Augmented Generation)"
Operation
SDK Methods: graphlit.createConversation() + graphlit.promptConversation()
GraphQL: createConversation + promptConversation mutations
Entity Type: Conversation
Common Use Cases: Q&A over documents, chatbots, context-aware AI responses, RAG applications
TypeScript (Canonical)
import { Graphlit } from 'graphlit-client';
import {
ConversationInput,
ConversationTypes,
EntityState,
ModelServiceTypes,
SpecificationTypes
} from 'graphlit-client/dist/generated/graphql-types';
const graphlit = new Graphlit();
// Step 1: Create a conversation
const conversationInput: ConversationInput = {
name: 'Document Q&A Session',
type: ConversationTypes.Content
};
const createResponse = await graphlit.createConversation(conversationInput);
const conversationId = createResponse.createConversation.id;
console.log(`Created conversation: ${conversationId}`);
// Step 2: Prompt the conversation (RAG query)
const promptResponse = await graphlit.promptConversation(
'What are the key findings in the document?', // prompt
conversationId // conversation ID
);
const answer = promptResponse.promptConversation.message.message;
console.log(`Answer: ${answer}`);
// Step 3: Continue the conversation (multi-turn)
const followUpResponse = await graphlit.promptConversation(
'Can you provide more details about the first finding?',
conversationId // Same conversation for context
);
console.log(`Follow-up answer: ${followUpResponse.promptConversation.message.message}`);
Parameters
createConversation
name (string): Display name for the conversation
type (ConversationTypes): Conversation type
CONTENT - Q&A over ingested content (most common)
MESSAGE - Chat-style conversations
specification (EntityReferenceInput): Optional LLM configuration
Specify model (GPT-4, Claude, etc.)
If omitted, uses project default
filter (ContentFilter): Optional filter to limit RAG context
Filter by collection, type, date, etc.
Only retrieve relevant content for answers
promptConversation
prompt (string): User's question or message
id (string): Conversation ID (optional)
If provided: Continues existing conversation
If omitted: Creates new conversation automatically
specification (EntityReferenceInput): Override LLM for this prompt
tools (ToolDefinitionInput[]): Tools for function calling (advanced)
systemPrompt (string): Override system instructions
Response
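A sketch of the shape promptConversation returns, inferred from the fields used in the examples above. Only message.message and message.citations are confirmed by this page; the conversation reference and citation fields are assumptions to be checked against the generated GraphQL types:

```typescript
// Hedged sketch of a promptConversation response, based on the fields
// used in the examples on this page. The 'conversation' and 'citations'
// shapes are assumptions; confirm against the generated GraphQL types.
const promptResponse = {
  promptConversation: {
    conversation: { id: 'c1d2...' },        // conversation that was prompted
    message: {
      role: 'ASSISTANT',
      message: 'The key findings are ...',  // the RAG answer text
      citations: [                          // present only when content matched
        { index: 0, text: 'Relevant passage ...', content: { id: 'a1b2...' } }
      ]
    }
  }
};

const answer = promptResponse.promptConversation.message.message;
console.log(answer);
```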
Developer Hints
Conversation ID Optional in promptConversation
When to create conversation explicitly:
Multi-turn conversations (preserves context)
Need to configure conversation (filter, specification)
Want to manage conversation lifecycle
When to skip explicit creation:
One-off questions
Stateless API endpoints
Quick prototyping
RAG Context from Ingested Content
promptConversation automatically retrieves relevant content for answers:
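A minimal sketch of the end-to-end flow, assuming default project configuration. The ingestUri/getContent return shapes follow the SDK's usual pattern but should be verified against the generated types, and a production app should add a timeout to the polling loop:

```typescript
import { Graphlit } from 'graphlit-client';
import { EntityState } from 'graphlit-client/dist/generated/graphql-types';

const graphlit = new Graphlit();

// 1. Ingest a document; RAG can only answer from ingested content.
const ingest = await graphlit.ingestUri('https://example.com/report.pdf');
const contentId = ingest.ingestUri.id;

// 2. Wait until extraction finishes (simplified polling, no timeout).
let state: EntityState | undefined;
do {
  await new Promise((resolve) => setTimeout(resolve, 2000));
  const content = await graphlit.getContent(contentId);
  state = content.content?.state;
} while (state !== EntityState.Finished);

// 3. Prompt: relevant chunks of the ingested content are retrieved
//    automatically and passed to the LLM as context.
const response = await graphlit.promptConversation('Summarize the report.');
console.log(response.promptConversation?.message?.message);
```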
Understanding Message Context
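Every prompt in the same conversation is answered with the prior turns as context. The accumulated messages can be inspected via getConversation (a sketch; the messages field shape is an assumption drawn from the GraphQL schema):

```typescript
import { Graphlit } from 'graphlit-client';

const graphlit = new Graphlit();

// Assumes 'conversationId' refers to a conversation that has been
// prompted at least once (see the canonical example above).
const conversationId = '<conversation-id>';

const details = await graphlit.getConversation(conversationId);

// Each turn is stored as a message; earlier turns are supplied to the
// LLM as context for later prompts in the same conversation.
for (const message of details.conversation?.messages ?? []) {
  console.log(`${message?.role}: ${message?.message}`);
}
```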
Citations for RAG Transparency
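When RAG retrieves content, the assistant message carries citations back to the source content. A sketch; the citation field names (content, text) follow the generated types but should be verified:

```typescript
import { Graphlit } from 'graphlit-client';

const graphlit = new Graphlit();

const response = await graphlit.promptConversation(
  'What are the key findings in the document?'
);

const message = response.promptConversation?.message;
console.log(`Answer: ${message?.message}`);

// Citations are only present when RAG actually matched ingested
// content; a pure LLM answer has none.
for (const citation of message?.citations ?? []) {
  console.log(`Cited content: ${citation?.content?.id}`);
  console.log(`Passage: ${citation?.text}`);
}
```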
Variations
1. Conversation with Specific Model
Use a custom LLM specification:
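A sketch using createSpecification to pin the model, then referencing it from the conversation. The Anthropic enum value name is an assumption; check the generated types for the models available in your SDK version:

```typescript
import { Graphlit } from 'graphlit-client';
import {
  AnthropicModels,
  ConversationTypes,
  ModelServiceTypes,
  SpecificationTypes
} from 'graphlit-client/dist/generated/graphql-types';

const graphlit = new Graphlit();

// Create an LLM specification that pins the model and sampling settings.
const specResponse = await graphlit.createSpecification({
  name: 'Claude Completion Spec',
  type: SpecificationTypes.Completion,
  serviceType: ModelServiceTypes.Anthropic,
  anthropic: {
    model: AnthropicModels.Claude_3_5Sonnet, // assumption: enum value name
    temperature: 0.1
  }
});
const specificationId = specResponse.createSpecification.id;

// Reference the specification when creating the conversation; every
// prompt in this conversation will use the pinned model.
const createResponse = await graphlit.createConversation({
  name: 'Claude Q&A Session',
  type: ConversationTypes.Content,
  specification: { id: specificationId }
});
```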
2. Conversation with Collection Filter
Limit RAG context to specific collections:
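A sketch limiting retrieval to a single collection via the conversation filter, using the ContentFilter shape named in the parameters above:

```typescript
import { Graphlit } from 'graphlit-client';
import { ConversationTypes } from 'graphlit-client/dist/generated/graphql-types';

const graphlit = new Graphlit();

// Assumes 'collectionId' identifies an existing collection of
// ingested documents.
const collectionId = '<collection-id>';

const createResponse = await graphlit.createConversation({
  name: 'Contracts Q&A',
  type: ConversationTypes.Content,
  // Only content in this collection is retrieved for answers.
  filter: { collections: [{ id: collectionId }] }
});

const response = await graphlit.promptConversation(
  'Which contracts renew this quarter?',
  createResponse.createConversation.id
);
console.log(response.promptConversation?.message?.message);
```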
3. Multi-Turn Conversation with Context
Build context across multiple turns:
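A minimal sketch reusing one conversation ID across turns, as in the canonical example, so each answer builds on the earlier questions and answers:

```typescript
import { Graphlit } from 'graphlit-client';
import { ConversationTypes } from 'graphlit-client/dist/generated/graphql-types';

const graphlit = new Graphlit();

const createResponse = await graphlit.createConversation({
  name: 'Research Session',
  type: ConversationTypes.Content
});
const conversationId = createResponse.createConversation.id;

// Each prompt reuses the same conversation ID, so the LLM sees the
// earlier questions and answers as context for later ones.
const questions = [
  'What methodology does the paper use?',
  'What were its limitations?',
  'How do those limitations affect the conclusions?'
];

for (const question of questions) {
  const response = await graphlit.promptConversation(question, conversationId);
  console.log(response.promptConversation?.message?.message);
}
```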
4. Single-Turn Query (No Conversation Creation)
Quick one-off questions:
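Omitting the conversation ID lets promptConversation create a conversation automatically, per the parameter notes above (a sketch):

```typescript
import { Graphlit } from 'graphlit-client';

const graphlit = new Graphlit();

// No conversation ID: promptConversation creates one behind the
// scenes, which suits stateless endpoints and quick prototyping.
const response = await graphlit.promptConversation(
  'What is the total revenue mentioned in the report?'
);

console.log(response.promptConversation?.message?.message);

// The auto-created conversation ID is still returned if you decide to
// continue the thread later (field name assumed from the generated types).
console.log(response.promptConversation?.conversation?.id);
```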
5. Conversation with Custom System Prompt
Override default behavior:
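A sketch passing systemPrompt to promptConversation. The positional slots shown for the optional parameters are assumptions; verify the exact promptConversation signature against the SDK typings for your version:

```typescript
import { Graphlit } from 'graphlit-client';

const graphlit = new Graphlit();

const conversationId = '<conversation-id>';

// systemPrompt overrides the default instructions for this prompt only.
// The positional slots below are assumptions; check the SDK typings.
const response = await graphlit.promptConversation(
  'Summarize the attached contract.',
  conversationId,
  undefined, // specification (use conversation default)
  undefined, // tools
  'You are a terse legal analyst. Answer in at most three sentences.'
);

console.log(response.promptConversation?.message?.message);
```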
6. Conversation with Date Filter
Ask questions about recent content:
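A sketch restricting retrieval to recently ingested content via a date range on the conversation filter. The creationDateRange field name is an assumption drawn from the ContentFilter shape; confirm it in the generated types:

```typescript
import { Graphlit } from 'graphlit-client';
import { ConversationTypes } from 'graphlit-client/dist/generated/graphql-types';

const graphlit = new Graphlit();

// Only retrieve content created in the last 7 days (ISO 8601 timestamp).
const since = new Date(Date.now() - 7 * 24 * 60 * 60 * 1000).toISOString();

const createResponse = await graphlit.createConversation({
  name: 'This Week in Docs',
  type: ConversationTypes.Content,
  // 'creationDateRange' is an assumption; confirm the field name
  // against ContentFilter in the generated GraphQL types.
  filter: { creationDateRange: { from: since } }
});

const response = await graphlit.promptConversation(
  'What changed in the documentation this week?',
  createResponse.createConversation.id
);
console.log(response.promptConversation?.message?.message);
```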
Common Issues
Issue: Response says "I don't have information about that"
Solution: Ensure content is ingested and in the FINISHED state. Check that the conversation filter isn't too restrictive.
Issue: Context from previous messages not working
Solution: Verify you're using the same conversationId for all prompts in the conversation.
Issue: Getting errors about missing specification
Solution: Ensure the project has a default LLM configured, or explicitly provide a specification when creating the conversation.
Issue: Citations not appearing in response
Solution: Citations only appear when RAG retrieves content. For pure LLM responses (no content match), there are no citations.
Issue: Conversation creates successfully but promptConversation fails
Solution: Check that content exists and matches the conversation filter. Try prompting without a filter first.
Production Example
Server-side conversation creation:
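A sketch of a server-side pattern: create the conversation once per user session, store its ID, and reuse it on every request. The session store and helper names here are illustrative assumptions:

```typescript
import { Graphlit } from 'graphlit-client';
import { ConversationTypes } from 'graphlit-client/dist/generated/graphql-types';

const graphlit = new Graphlit();

// Illustrative in-memory session store; use a durable store in production.
const sessionConversations = new Map<string, string>();

async function getOrCreateConversation(sessionId: string): Promise<string> {
  const existing = sessionConversations.get(sessionId);
  if (existing) return existing;

  const created = await graphlit.createConversation({
    name: `Session ${sessionId}`,
    type: ConversationTypes.Content
  });
  const conversationId = created.createConversation.id;
  sessionConversations.set(sessionId, conversationId);
  return conversationId;
}

async function answer(sessionId: string, question: string): Promise<string> {
  // Reusing the stored ID preserves multi-turn context per session.
  const conversationId = await getOrCreateConversation(sessionId);
  const response = await graphlit.promptConversation(question, conversationId);
  return response.promptConversation?.message?.message ?? '';
}
```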
One-off query pattern:
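And the stateless one-off pattern, wrapping a single prompt with basic error handling (a sketch; the helper name is illustrative):

```typescript
import { Graphlit } from 'graphlit-client';

const graphlit = new Graphlit();

// Stateless: no conversation is created up front and no ID is stored,
// so this suits one-off questions and simple API endpoints.
async function askOnce(question: string): Promise<string> {
  try {
    const response = await graphlit.promptConversation(question);
    return response.promptConversation?.message?.message ?? '(no answer)';
  } catch (error) {
    console.error('promptConversation failed:', error);
    throw error;
  }
}

console.log(await askOnce('List the action items from the latest meeting notes.'));
```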