powerful-dog-59275
09/12/2025, 4:47 PM

tall-airline-16756
09/12/2025, 4:48 PM
You can override the llmNode() method in a custom Agent subclass. This approach lets you insert your acknowledgment logic immediately after the LLM generates a response and just before any tool calls happen.
Here’s a friendly example to help you get started:
```typescript
import { voice, llm } from '@livekit/agents';

class MyAgent extends voice.Agent {
  async llmNode(
    chatCtx: llm.ChatContext,
    tools: llm.FunctionTool[],
    modelSettings: voice.ModelSettings,
  ): Promise<AsyncIterable<llm.ChatChunk | string>> {
    // First, run the default LLM processing
    const chunks = await voice.Agent.default.llmNode(this, chatCtx, tools, modelSettings);

    // Then, create an acknowledgment message after the LLM response
    const acknowledgment: llm.ChatMessage = {
      id: `ack_${Date.now()}`,
      role: 'assistant',
      content: ['Just a moment, I’m processing your request...'],
      createdAt: new Date(),
    };

    // Add the acknowledgment into the chat context
    this.session.chatCtx.addMessage(acknowledgment);

    // Make the acknowledgment visible to the caller
    // (note: _conversationItemAdded is an internal API and may change between versions)
    this.session._conversationItemAdded(acknowledgment);

    // Finally, return the original LLM response chunks unchanged
    return chunks;
  }
}
```
This way, your caller will see a friendly acknowledgment before the tool execution starts. If you’d like me to walk you through any part of this or if you have more questions, just let me know—I’m here to help!
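If it helps to see the underlying pattern in isolation: the core idea is intercepting an async stream of LLM chunks and surfacing an acknowledgment before the rest of the output flows through. Here is a minimal, framework-free sketch of that idea. Note that withAcknowledgment is a hypothetical helper written for illustration only, not a LiveKit API:

```typescript
// Hypothetical helper (not part of @livekit/agents): wraps a stream of
// LLM output chunks so an acknowledgment string is emitted first.
async function* withAcknowledgment(
  chunks: AsyncIterable<string>,
  ack: string,
): AsyncIterable<string> {
  // Emit the acknowledgment before any model output reaches the caller
  yield ack;
  // Then pass the original chunks through unchanged
  for await (const chunk of chunks) {
    yield chunk;
  }
}
```

In the real agent, the same interception point is your llmNode() override, which receives the default node's output and can act before returning it.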
Sources: livekit-agents/livekit/agents/voice/agent_activity.py | agents/src/llm/provider_format/openai.test.ts | Pipeline nodes and hooks | LiveKit Docs | Tool definition and use | LiveKit Docs