How to periodically pass/push some information to ...
# ask-ai
b
How to periodically pass/push some information to agents?
t
Thanks for reaching out, Delivery. I'm an AI assistant and I'm here to help. Give me a moment while I research this topic.
Hi there! To help you periodically pass or push information to agents in LiveKit, here are a few methods you can try out:
• Data Publishing in the Room: Your agents can easily share data with other participants in the room using the `publish_data` method. For example:
```python
await room.local_participant.publish_data("hello world")
```
• Remote Procedure Call (RPC): This lets agents send structured information by calling RPC methods on remote participants. Here's a quick example:
```python
await context.userdata.ctx.room.local_participant.perform_rpc(
    destination_identity=client_participant.identity,
    method="nutrition.update",
    payload=json.dumps({...})
)
```
• Periodic Tasks with `asyncio.sleep`: You can set up a loop in the agent's entrypoint to send updates regularly (a fuller entrypoint sketch follows after this list), like this:
```python
while True:
    rtc_stats = await ctx.room.get_session_stats()
    # process or send stats
    await asyncio.sleep(5 * 60)  # wait 5 minutes
```
• Job Metadata During Dispatch: When dispatching an agent, you can include metadata accessible through `JobContext` (a sketch of reading it on the agent side follows after this list). For example:
```python
dispatch = await client.create_room_dispatch(
    room_name="my-room",
    agent_name="test-agent",
    metadata='{"user_id": "123", "name": "John"}'
)
```
If you're aiming for high-frequency updates, I'd recommend using data packets or periodic RPC/publishing to keep things smooth and avoid synchronization overhead. Please don't hesitate to reach out if you'd like a hand implementing any of these methods; I'm here to help!
Sources: Inside a session | LiveKit Docs | complex-agents/nutrition-assistant/agent.py | livekit-agents/livekit/agents/worker.py | Agent dispatch | LiveKit Docs
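For the periodic-task option above, here's a minimal sketch of how such a loop is typically wired into an agent entrypoint, combined with `publish_data` from the first bullet. The topic name, payload, and 60-second interval are illustrative assumptions, not part of the original snippet:
```python
import asyncio
import json

from livekit.agents import JobContext


async def entrypoint(ctx: JobContext):
    await ctx.connect()

    async def push_status_periodically():
        # Illustrative payload; replace with whatever your frontend expects.
        while True:
            await ctx.room.local_participant.publish_data(
                json.dumps({"type": "heartbeat"}),
                topic="agent-status",  # assumed topic name, used for filtering on the client
            )
            await asyncio.sleep(60)  # once a minute

    # Run the loop in the background so it doesn't block the rest of the entrypoint.
    # Keep a reference so the task isn't garbage-collected.
    status_task = asyncio.create_task(push_status_periodically())
```
In production you would also want to cancel `status_task` when the job shuts down; the sketch omits that for brevity.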
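For the dispatch-metadata option, here's a rough sketch of reading that metadata inside the agent, assuming it was sent as the JSON string shown above. The attribute path `ctx.job.metadata` is where the agents framework usually exposes it, but treat it as an assumption to verify against your SDK version:
```python
import json

from livekit.agents import JobContext


async def entrypoint(ctx: JobContext):
    # The metadata passed at dispatch time arrives as a raw string on the job.
    raw = ctx.job.metadata or "{}"
    user_info = json.loads(raw)
    print("dispatched for user:", user_info.get("user_id"))

    await ctx.connect()
```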
b
No, I need to send some information to the agent, not from the agent. Something additional happening
t
I'd love to help you send information to an agent in LiveKit! Here are some ways you can do that (minimal sketches of the text-stream and RPC approaches follow after this list):
• Text Streams: You can send real-time text input, like transcriptions or user messages, directly to the agent. It works similarly to spoken audio and is perfect if you're focusing on text-only interactions.
• Remote Procedure Calls (RPCs): These let you send structured data or trigger specific actions. For instance, you might call a function the agent provides to share metadata or other important info.
• Agent Dispatch Metadata: When you dispatch an agent to a room, you can include custom metadata. This is super useful because the agent can access it within its job context and tailor its behavior for the session.
• External HTTP Communication: Another option is to use webhooks or external HTTP requests to send data to a backend endpoint, which then passes it along to the agent through text streams or RPCs.
If you'd like, I'm here to guide you through implementing any of these methods or chat about your specific use case. Just let me know!
Sources: README.md | packages/livekit-server-sdk/src/AgentDispatchClient.ts | Text and transcriptions | LiveKit Docs
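For the text-stream route, here's a rough sketch of how an agent might receive text pushed from a client, based on the text stream handler API in the LiveKit Python SDK. The topic name `"extra-context"` and what you do with the received text are assumptions for illustration, and the exact reader API may differ slightly between SDK versions:
```python
import asyncio

from livekit.agents import JobContext


async def entrypoint(ctx: JobContext):
    await ctx.connect()

    async def handle_incoming_text(reader, participant_identity):
        # Collect the full text the client streamed on this topic.
        text = await reader.read_all()
        print(f"received from {participant_identity}: {text}")
        # ... inject `text` into the agent's chat context or state here ...

    def on_text_stream(reader, participant_identity):
        # The handler is invoked synchronously, so hand off to a background task.
        asyncio.create_task(handle_incoming_text(reader, participant_identity))

    ctx.room.register_text_stream_handler("extra-context", on_text_stream)
```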
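For the RPC route, here's a minimal sketch of the agent registering a method that a frontend participant can call to push data in. The method name `agent.update_context` and the payload shape are illustrative assumptions:
```python
import json

from livekit import rtc
from livekit.agents import JobContext


async def entrypoint(ctx: JobContext):
    await ctx.connect()

    @ctx.room.local_participant.register_rpc_method("agent.update_context")
    async def handle_update(data: rtc.RpcInvocationData) -> str:
        # `data.payload` is the string the caller sent; here we assume it is JSON.
        info = json.loads(data.payload)
        print(f"update from {data.caller_identity}: {info}")
        # ... feed `info` into the agent's state or chat context here ...
        return "ok"  # the returned string is delivered back to the caller
```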
b
self.session.generate_reply method purpose
t
I'm happy to walk you through the `self.session.generate_reply` method in LiveKit! This method prompts the Large Language Model (LLM) to create a dynamic response during an agent session, and it can be used in two ways:
• With Instructions: You can give specific directions for the agent to craft a response. For example: `self.session.generate_reply(instructions="Greet the user warmly")`. Just a heads-up: the instructions themselves aren't added to the chat history; only the response that gets generated is recorded.
• With User Input: You can also provide user input directly, so the response is based on what the user said. For example: `self.session.generate_reply(user_input="I'm sorry, I couldn't transfer your call...")`. In this case, the input is added to the chat history before the response is generated.
This method is especially useful in lifecycle hooks like `on_enter` or `on_exit` to start off greetings or say farewells warmly (a minimal `on_enter` sketch follows below). When you use it, it returns a `SpeechHandle` object, which you can await to make sure the speech finishes processing, and it also triggers a `speech_created` event. If you'd like any more details or some extra examples, just let me know; I'm here to help!
Sources: Agent speech and audio | LiveKit Docs | Pipeline nodes and hooks | LiveKit Docs | Workflows | LiveKit Docs | Company directory phone assistant | LiveKit Docs
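To make that concrete, here's a minimal sketch of an agent that greets the user from `on_enter`, assuming a typical livekit-agents 1.x setup; the class name and instruction text are illustrative:
```python
from livekit.agents import Agent


class Greeter(Agent):
    def __init__(self) -> None:
        super().__init__(instructions="You are a friendly assistant.")

    async def on_enter(self) -> None:
        # Prompt the LLM for a dynamic greeting as soon as this agent becomes active.
        # generate_reply() gives back a SpeechHandle; awaiting it waits for the speech to finish.
        await self.session.generate_reply(instructions="Greet the user warmly")
```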
b
other similar methods?
t
I'd love to share some other methods similar to `self.session.generate_reply` in LiveKit that you might find useful:
• `session.say()`: This method lets the agent speak a predefined message using the configured TTS system. Unlike `generate_reply`, it doesn't create responses using an LLM. For example: `await session.say("This is a fixed message.")`. You can also decide whether the text should be added to the chat context by using the `add_to_chat_ctx` parameter.
• Lifecycle Hook Methods: The `generate_reply` method is often used within lifecycle hooks like `on_enter` or `on_user_turn_completed` to create dynamic responses as the agent moves through different states. For instance, `on_enter` can use `generate_reply` to warmly greet the user when the agent becomes active.
While `session.say` is a great choice for straightforward speech output, `generate_reply` plays a key role in more complex conversational flows, including task-based workflows and structured interactions (a short sketch comparing the two follows below). If you'd like, I'm here to provide more details or examples; just let me know!
Sources: Agent speech and audio | LiveKit Docs | Pipeline nodes and hooks | LiveKit Docs | Workflows | LiveKit Docs
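As a rough illustration of when each one fits, here's a sketch of an agent that uses `say()` for a fixed line and `generate_reply()` for the LLM-driven follow-up; the wording and the `add_to_chat_ctx=False` choice are assumptions for this example:
```python
from livekit.agents import Agent


class Concierge(Agent):
    def __init__(self) -> None:
        super().__init__(instructions="You are a helpful concierge.")

    async def on_enter(self) -> None:
        # Fixed wording, spoken via TTS only; no LLM call is made here.
        await self.session.say(
            "Hi, one moment while I pull up your account.",
            add_to_chat_ctx=False,  # keep the canned line out of the chat history
        )
        # LLM-generated follow-up that does land in the chat history.
        await self.session.generate_reply(
            instructions="Welcome the user and ask how you can help today."
        )
```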