# ask-ai
w
how to make sure the agent always asks the customer to give it a moment while it performs some actions using tool calling?
t
Thanks for reaching out, Sunil. I'm an AI assistant and I'm here to help. Give me a moment while I research this topic.
To make sure the agent asks the customer to wait while performing actions, you can use the `SpeechHandle` object to manage the agent's speech flow. Here's a simple way to do it:

1. Create a reply with `session.generate_reply()` to let the customer know to wait. For example: `handle = session.generate_reply(instructions="Please wait while I perform this action.")`
2. Carry out the action or tool execution needed.
3. Use `await handle.wait_for_playout()` to pause until the message finishes, making sure the customer has been informed before moving on.

Example code:
```python
handle = session.generate_reply(instructions="Please wait while I perform this action.")

# Perform the action or tool execution here

await handle.wait_for_playout()
```
This way, the agent tells the customer to expect a short wait and finishes speaking before continuing with the action. If you have any questions or want to chat more about this, just let me know!
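As a small variation on the steps above, the slow work can also run concurrently with the playout instead of after it, so the agent starts working while it is still speaking. This is a minimal sketch, not from the thread (the `do_slow_work` and `speak_while_working` names are hypothetical placeholders), assuming livekit-agents 1.x where `generate_reply()` returns a `SpeechHandle`:

```python
import asyncio

from livekit.agents import AgentSession


async def do_slow_work() -> str:
    # Hypothetical placeholder for the real work (API call, database lookup, ...)
    await asyncio.sleep(5)
    return "done"


async def speak_while_working(session: AgentSession) -> str:
    handle = session.generate_reply(
        instructions="Please wait while I perform this action."
    )
    # Run the slow work and the wait-message playout concurrently, so the
    # agent is already working while it speaks
    result, _ = await asyncio.gather(do_slow_work(), handle.wait_for_playout())
    return result
```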
w
give me the complete code
t
Here’s a complete code example that shows how to politely ask the customer to hang tight while the agent is working on something:
```python
import asyncio
from livekit.agents import function_tool, RunContext


# A tool the LLM can call: it tells the user to wait, performs the slow
# operation, then reports back once the wait message has finished playing
@function_tool
async def perform_slow_operation(rc: RunContext):
    # generate_reply() returns a SpeechHandle; don't await it here, so the
    # slow work can start while the wait message is being spoken
    handle = rc.session.generate_reply(
        instructions="Please wait while I process your request."
    )

    try:
        # Perform the slow operation
        await asyncio.sleep(5)  # simulate a slow operation

        # Awaiting the handle waits for the wait message to finish playing
        await handle

        # Let the user know the operation is done
        await rc.session.generate_reply(
            instructions="The operation is complete. Here are the results."
        )

    except asyncio.CancelledError:
        # The tool call was cancelled, e.g. because the user interrupted
        if handle.interrupted:
            # Schedule a short acknowledgement without awaiting it, since
            # this task is in the middle of being cancelled
            rc.session.generate_reply(
                instructions="Operation was cancelled."
            )
        raise
```
Key features:

• Speech handling: uses `generate_reply()` to keep the user informed.
• Background operations: simulates a slow operation with `asyncio.sleep()`.
• Error handling: includes a try/except block to gracefully manage interruptions.
• Interrupt check: checks `handle.interrupted` to see if the user stopped the process.

If you want me to walk you through any part of this or need a hand with something else, just let me know!
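For completeness, a tool like this would then be registered on an agent. The following is only a rough sketch under assumptions not stated in the thread: livekit-agents 1.x, an OpenAI Realtime model as a placeholder, and the standard worker `entrypoint` wiring:

```python
from livekit import agents
from livekit.agents import Agent, AgentSession
from livekit.plugins import openai


async def entrypoint(ctx: agents.JobContext):
    await ctx.connect()

    agent = Agent(
        instructions="You are a helpful voice assistant.",
        tools=[perform_slow_operation],  # the tool defined above
    )
    session = AgentSession(
        llm=openai.realtime.RealtimeModel(),  # placeholder model choice
    )
    await session.start(agent=agent, room=ctx.room)


if __name__ == "__main__":
    agents.cli.run_app(agents.WorkerOptions(entrypoint_fnc=entrypoint))
```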
w
is it not possible to handle this via a generic system instruction so that the LLM can generate a response to the customer before invoking a tool?
t
Absolutely, you can handle this using a generic system instruction so the LLM responds to the customer before calling any tools. Here's a simple way to do it:

1. Use the `update_instructions` method to set system-level guidance for the LLM's behavior. For example:
```python
session.update_instructions(
    "First, provide a response to the user. Then, invoke any necessary tools."
)
```
2. Next, generate the initial response with the `tool_choice` parameter set to `"none"` so no tools are triggered right away:
```python
session.generate_reply(
    instructions="First, provide a response to the user.",
    tool_choice="none"
)
```
3. After that first response, allow tool invocation by setting `tool_choice` to `"auto"` or `"required"`, depending on what you need:
```python
session.generate_reply(
    tool_choice="auto"
)
```
Key features:

• System instructions: the `update_instructions` method helps you guide the LLM to respond first before using any tools.
• Tool choice parameter: this ensures tools only kick in after the initial reply, keeping things smooth and clear.

If you'd like me to walk you through it more or help set it up, just let me know. I'm here to help!

Sources: livekit-agents/livekit/agents/voice/agent_activity.py | Function calling with Voice Agents | LiveKit Docs | common/chat.cpp
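Putting the two ideas together, a rough sketch of the "respond first, then allow tools" flow could look like the following. The helper name `respond_then_act` is illustrative, not from the thread; it assumes livekit-agents 1.x, where `generate_reply()` accepts a `tool_choice` argument and returns a `SpeechHandle`:

```python
from livekit.agents import AgentSession


async def respond_then_act(session: AgentSession) -> None:
    # Step 1: force a spoken acknowledgement; tools are disabled for this turn
    handle = session.generate_reply(
        instructions="Briefly acknowledge the request and ask the user to wait.",
        tool_choice="none",
    )
    await handle.wait_for_playout()

    # Step 2: generate a follow-up turn where the model may call tools
    session.generate_reply(tool_choice="auto")
```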