# ask-ai
p
Using the Node Agents JS v1, we need to be able to generate an acknowledgment to the caller after a response, before tool calls are invoked or executed. Do you have a suggestion on the best place to implement this?
t
Thanks for reaching out, Scott. I'm an AI assistant and I'm here to help. Give me a moment while I research this topic.
I’m happy to guide you through generating an acknowledgment to the caller right after a response, but before tool calls are executed, in Node.js Agents v1. You can achieve this by overriding the `llmNode()` method in a custom `Agent` subclass. This approach lets you insert your acknowledgment logic immediately after the LLM generates a response and just before any tool calls happen. Here’s a friendly example to help you get started:
```typescript
import { voice, llm } from '@livekit/agents';

class MyAgent extends voice.Agent {
  async llmNode(
    chatCtx: llm.ChatContext,
    tools: llm.FunctionTool[],
    modelSettings: voice.ModelSettings,
  ): Promise<AsyncIterable<llm.ChatChunk | string>> {
    // First, proceed with the default LLM processing
    const chunks = await voice.Agent.default.llmNode(this, chatCtx, tools, modelSettings);

    // Then, create a warm acknowledgment message after the LLM response
    const acknowledgment: llm.ChatMessage = {
      id: `ack_${Date.now()}`,
      role: 'assistant',
      content: ['Just a moment, I’m processing your request...'],
      createdAt: new Date(),
    };

    // Add this acknowledgment into the chat context
    this.session.chatCtx.addMessage(acknowledgment);

    // Make sure the acknowledgment is visible to the caller
    this.session._conversationItemAdded(acknowledgment);

    // Finally, return the original LLM response chunks
    return chunks;
  }
}
```
This way, your caller will see a friendly acknowledgment before the tool execution starts. If you’d like me to walk you through any part of this, or if you have more questions, just let me know—I’m here to help!
Sources: livekit-agents/livekit/agents/voice/agent_activity.py | agents/src/llm/provider_format/openai.test.ts | Pipeline nodes and hooks | LiveKit Docs | Tool definition and use | LiveKit Docs
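Independent of the LiveKit-specific API above, the underlying pattern is simple to sketch in plain TypeScript: wrap the model's chunk stream in an async generator that yields an acknowledgment first, then forwards the original chunks unchanged. Everything below (`withAcknowledgment`, `fakeLlmChunks`) is illustrative naming, not part of `@livekit/agents`:

```typescript
// Sketch of the pattern only: emit an acknowledgment, then forward the
// model's chunks untouched. Nothing here depends on @livekit/agents.
async function* withAcknowledgment(
  chunks: AsyncIterable<string>,
  ack = 'Just a moment, I\u2019m processing your request...',
): AsyncIterable<string> {
  yield ack;      // the caller sees/hears this first
  yield* chunks;  // then the original response stream, unchanged
}

// Stand-in for the real LLM output stream.
async function* fakeLlmChunks(): AsyncIterable<string> {
  yield 'Here is ';
  yield 'the answer.';
}

async function main() {
  const out: string[] = [];
  for await (const piece of withAcknowledgment(fakeLlmChunks(), 'One sec...')) {
    out.push(piece);
  }
  console.log(out.join('|')); // One sec...|Here is |the answer.
}

main();
```

In the real agent, the same interleaving happens inside the overridden `llmNode()`: the acknowledgment is surfaced to the session before the LLM chunks (and any tool calls they trigger) are handed back to the pipeline.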
p
the problem is we are using the OAI Realtime workflow
t
Thank you so much for sharing that you're using the OAI Realtime workflow! To help you better, could you please share a bit more about how you’ve set things up? For example:
• Are you including any custom logic or middleware in your workflow?
• Would you like the acknowledgment to be part of the LLM response, or should it come as a separate message?
• Are there any particular constraints or special requirements for how the acknowledgment should be delivered?

Once I have a clearer picture, I’ll be more than happy to guide you through implementing the acknowledgment smoothly.
p
The agent will ask a question, and the caller will give a response. We need the agent to acknowledge the caller's response before it decides whether a tool call should be invoked.
t
I don't have the answer you're looking for. You could also try asking your question:
• in one of the other Slack channels, or
• at https://deepwiki.com/livekit/livekit_composite, which is trained on all LiveKit source code

If you find the answer, please post it here to help others!