Create and Prompt
User Intent
"I want to create a conversation and ask questions with RAG (Retrieval Augmented Generation)"
Operation
SDK Method: graphlit.createConversation() + graphlit.promptConversation()
GraphQL: createConversation + promptConversation mutations
Entity Type: Conversation
Common Use Cases: Q&A over documents, chatbots, context-aware AI responses, RAG applications
TypeScript (Canonical)
import { Graphlit } from 'graphlit-client';
import {
ConversationInput,
ConversationTypes,
EntityState,
ModelServiceTypes,
SpecificationTypes
} from 'graphlit-client/dist/generated/graphql-types';
const graphlit = new Graphlit();
// Step 1: Create a conversation
const conversationInput: ConversationInput = {
name: 'Document Q&A Session',
type: ConversationTypes.Content
};
const createResponse = await graphlit.createConversation(conversationInput);
const conversationId = createResponse.createConversation.id;
console.log(`Created conversation: ${conversationId}`);
// Step 2: Prompt the conversation (RAG query)
const promptResponse = await graphlit.promptConversation(
'What are the key findings in the document?', // prompt
conversationId // conversation ID
);
const answer = promptResponse.promptConversation.message.message;
console.log(`Answer: ${answer}`);
// Step 3: Continue the conversation (multi-turn)
const followUpResponse = await graphlit.promptConversation(
'Can you provide more details about the first finding?',
conversationId // Same conversation for context
);
console.log(`Follow-up answer: ${followUpResponse.promptConversation.message.message}`);
Parameters
createConversation
name (string): Display name for the conversation
type (ConversationTypes): Conversation type
CONTENT - Q&A over ingested content (most common)
MESSAGE - Chat-style conversations
specification (EntityReferenceInput): Optional LLM configuration
Specify model (GPT-4, Claude, etc.)
If omitted, uses project default
filter (ContentFilter): Optional filter to limit RAG context
Filter by collection, type, date, etc.
Only retrieve relevant content for answers
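Taken together, the optional fields above combine into a single input object. A minimal sketch (the specification and collection IDs are placeholders, not real entities):

```typescript
// Hedged sketch: a ConversationInput that pins the LLM and scopes RAG retrieval.
// 'YOUR_SPECIFICATION_ID' / 'YOUR_COLLECTION_ID' are placeholder IDs.
const conversationInput = {
  name: 'Filtered Support Chat',
  type: 'CONTENT', // ConversationTypes.Content when using the generated enums
  specification: { id: 'YOUR_SPECIFICATION_ID' }, // optional: pin a specific LLM
  filter: { collections: [{ id: 'YOUR_COLLECTION_ID' }] } // optional: limit RAG context
};
// const conv = await graphlit.createConversation(conversationInput);
```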
promptConversation
prompt (string): User's question or message
id (string): Conversation ID (optional)
If provided: Continues existing conversation
If omitted: Creates new conversation automatically
specification (EntityReferenceInput): Override LLM for this prompt
tools (ToolDefinitionInput[]): Tools for function calling (advanced)
systemPrompt (string): Override system instructions
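Because promptConversation takes these as positional arguments (Variation 5 below shows the full order used in this guide), a small helper can make per-prompt overrides easier to read. This wrapper is a sketch, not part of the SDK; the positional order is an assumption taken from the examples on this page:

```typescript
// Hypothetical helper (not part of the SDK): maps named options onto the
// positional promptConversation arguments used in this guide:
// (prompt, id, specification, mimeType, data, tools, requireTool, systemPrompt)
type PromptOptions = { specificationId?: string; systemPrompt?: string };

function buildPromptArgs(prompt: string, conversationId?: string, opts: PromptOptions = {}) {
  return [
    prompt,
    conversationId,
    opts.specificationId ? { id: opts.specificationId } : undefined, // specification
    undefined, // mimeType
    undefined, // data
    undefined, // tools
    undefined, // requireTool
    opts.systemPrompt // systemPrompt
  ] as const;
}

// Usage sketch:
// const response = await graphlit.promptConversation(
//   ...buildPromptArgs('What is AI?', conversationId, { systemPrompt: 'Answer briefly.' })
// );
```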
Response
// createConversation
{
createConversation: {
id: string; // Conversation ID
name: string; // Conversation name
type: ConversationTypes; // CONTENT or MESSAGE
state: EntityState; // ENABLED
specification?: Specification; // LLM configuration
}
}
// promptConversation
{
promptConversation: {
messageCount: number; // Total messages in conversation
message: {
message: string; // AI's response
role: MessageRoles.ASSISTANT; // ASSISTANT
tokens?: number; // Tokens used
completionTime?: number; // Response time (ms)
citations?: Citation[]; // Source references (if RAG used)
}
}
}
Developer Hints
Conversation ID Optional in promptConversation
// Method 1: Create conversation first (recommended for multi-turn)
const conv = await graphlit.createConversation({ name: 'Chat', type: ConversationTypes.Content });
const response = await graphlit.promptConversation('Hello', conv.createConversation.id);
// Method 2: Let promptConversation create conversation (single-turn queries)
const response = await graphlit.promptConversation('Hello'); // Creates conversation implicitly
When to create conversation explicitly:
Multi-turn conversations (preserves context)
Need to configure conversation (filter, specification)
Want to manage conversation lifecycle
When to skip explicit creation:
One-off questions
Stateless API endpoints
Quick prototyping
RAG Context from Ingested Content
promptConversation automatically retrieves relevant content for answers:
// 1. First, ingest content
await graphlit.ingestUri('https://example.com/product-docs.pdf', undefined, undefined, true);
// 2. Create conversation
const conv = await graphlit.createConversation({
name: 'Product Support',
type: ConversationTypes.Content
});
// 3. Ask questions - Graphlit retrieves relevant sections automatically
const response = await graphlit.promptConversation(
'How do I reset my password?',
conv.createConversation.id
);
// Response includes citations showing which content was used
console.log(`Answer: ${response.promptConversation.message.message}`);
console.log(`Sources: ${response.promptConversation.message.citations?.length || 0} citations`);
Understanding Message Context
// First message
const msg1 = await graphlit.promptConversation('What is machine learning?', conversationId);
console.log(msg1.promptConversation.messageCount); // 2 (user + assistant)
// Second message (has context from first)
const msg2 = await graphlit.promptConversation('Can you give an example?', conversationId);
console.log(msg2.promptConversation.messageCount); // 4 (user + assistant + user + assistant)
// The conversation remembers previous messages
Citations for RAG Transparency
const response = await graphlit.promptConversation(
'What does the contract say about payment terms?',
conversationId
);
// Check citations to see which content was used
response.promptConversation.message.citations?.forEach(citation => {
console.log(`Source: ${citation.content?.name}`);
console.log(`Relevance: ${citation.relevance}`);
console.log(`Excerpt: ${citation.text}`);
});
Variations
1. Conversation with Specific Model
Use a custom LLM specification:
// Create specification for Claude
// (SpecificationInput and AnthropicModels are also imported from 'graphlit-client/dist/generated/graphql-types')
const specInput: SpecificationInput = {
name: 'Claude Sonnet',
type: SpecificationTypes.Completion,
serviceType: ModelServiceTypes.Anthropic,
anthropic: {
model: AnthropicModels.Claude_3_5_Sonnet
}
};
const specResponse = await graphlit.createSpecification(specInput);
// Create conversation with specification
const convResponse = await graphlit.createConversation({
name: 'Chat with Claude',
type: ConversationTypes.Content,
specification: { id: specResponse.createSpecification.id }
});
// Prompts will use Claude
const response = await graphlit.promptConversation(
'Explain quantum computing',
convResponse.createConversation.id
);
2. Conversation with Collection Filter
Limit RAG context to specific collections:
// Create collection
const collectionResponse = await graphlit.createCollection({
name: 'HR Policies'
});
// Create conversation filtered to HR collection
const convResponse = await graphlit.createConversation({
name: 'HR Support Bot',
type: ConversationTypes.Content,
filter: {
collections: [{ id: collectionResponse.createCollection.id }]
}
});
// Questions only retrieve from HR Policies collection
const response = await graphlit.promptConversation(
'What is the vacation policy?',
convResponse.createConversation.id
);
// Answer based only on content in HR Policies collection
3. Multi-Turn Conversation with Context
Build context across multiple turns:
const conv = await graphlit.createConversation({
name: 'Data Analysis',
type: ConversationTypes.Content
});
const convId = conv.createConversation.id;
// Turn 1
const msg1 = await graphlit.promptConversation(
'What was our Q4 revenue?',
convId
);
console.log(msg1.promptConversation.message.message);
// Turn 2 (references "our" from turn 1)
const msg2 = await graphlit.promptConversation(
'How does that compare to Q3?',
convId
);
console.log(msg2.promptConversation.message.message);
// Turn 3 (references revenue topic from turns 1-2)
const msg3 = await graphlit.promptConversation(
'What drove the change?',
convId
);
console.log(msg3.promptConversation.message.message);
4. Single-Turn Query (No Conversation Creation)
Quick one-off questions:
// No conversation ID - creates one automatically
const response = await graphlit.promptConversation(
'Summarize the key points in the uploaded documents'
);
const answer = response.promptConversation.message.message;
console.log(answer);
// Note: You lose the conversationId, so you can't continue this conversation later
5. Conversation with Custom System Prompt
Override default behavior:
const response = await graphlit.promptConversation(
'What is AI?',
conversationId,
undefined, // specification
undefined, // mimeType
undefined, // data
undefined, // tools
undefined, // requireTool
'You are a helpful assistant that explains complex topics to 5-year-olds. Use simple language and fun analogies.' // systemPrompt
);
// Response will be in simple, child-friendly language
6. Conversation with Date Filter
Ask questions about recent content:
const lastWeek = new Date();
lastWeek.setDate(lastWeek.getDate() - 7);
const conv = await graphlit.createConversation({
name: 'Recent News',
type: ConversationTypes.Content,
filter: {
creationDateRange: {
from: lastWeek,
to: new Date()
}
}
});
// Only retrieves from content ingested in last week
const response = await graphlit.promptConversation(
'What are the latest developments?',
conv.createConversation.id
);
Common Issues
Issue: Response says "I don't have information about that"
Solution: Ensure content is ingested and in FINISHED state. Check that the conversation filter isn't too restrictive.
Issue: Context from previous messages not working
Solution: Verify you're using the same conversationId for all prompts in the conversation.
Issue: Getting errors about missing specification
Solution: Ensure the project has a default LLM configured, or explicitly provide a specification when creating the conversation.
Issue: Citations not appearing in response
Solution: Citations only appear when RAG retrieves content. For pure LLM responses (no content match), there are no citations.
Issue: Conversation creates successfully but promptConversation fails
Solution: Check that content exists and matches the conversation filter. Try prompting without the filter first.
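For the first issue above (content not yet available for RAG), waiting for ingestion to finish before prompting helps in practice. The helper below is a generic sketch; the isContentDone call in the usage comment is an assumption about the SDK and should be checked against your client version:

```typescript
// Hedged sketch: poll an async "is it done?" check until it succeeds or times out.
async function pollUntilDone(
  check: () => Promise<boolean>,
  intervalMs = 2000,
  timeoutMs = 60000
): Promise<boolean> {
  const start = Date.now();
  while (Date.now() - start < timeoutMs) {
    if (await check()) return true;
    await new Promise((resolve) => setTimeout(resolve, intervalMs));
  }
  return false; // timed out before the check succeeded
}

// Usage sketch (assumes an isContentDone helper exists on the client):
// const ready = await pollUntilDone(async () =>
//   (await graphlit.isContentDone(contentId)).isContentDone?.result ?? false
// );
// if (ready) { /* safe to promptConversation over this content */ }
```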
Production Example
Server-side conversation creation:
const conversationInput: ConversationInput = {
name: userQuestion.substring(0, 80), // First 80 chars of question
type: ConversationTypes.Content,
specification: specificationId ? { id: specificationId } : undefined
};
const response = await graphlit.createConversation(conversationInput, userId);
const conversationId = response.createConversation.id;
One-off query pattern:
// Create conversation implicitly without explicit ID
const response = await graphlit.promptConversation(
userQuestion,
undefined, // No conversation ID - creates new
undefined, // specification
undefined, // mimeType
undefined, // data
undefined, // tools
undefined, // requireTool
undefined, // systemPrompt
undefined, // includeDetails
userId // correlationId for tracking
);
const answer = response.promptConversation?.message?.message;