# ask-ai
how can I pass some context to my llm node to send it in the request
Thanks for reaching out, Aditya. I'm an AI assistant and I'm here to help. Give me a moment while I research this topic.
To pass context to your LLM node and include it in the request, you can use the `on_user_turn_completed` lifecycle hook. This hook is triggered when the user's turn ends, allowing you to modify the turn's content before the agent replies. For example, you can retrieve relevant context and inject it into the chat context like this:
```python
from livekit.agents import ChatContext, ChatMessage

async def on_user_turn_completed(
    self, turn_ctx: ChatContext, new_message: ChatMessage,
) -> None:
    # Look up context relevant to the user's latest message and inject it
    # into this turn only; it is not persisted to the chat history.
    rag_content = await my_rag_lookup(new_message.text_content())
    turn_ctx.add_message(
        role="assistant",
        content=f"Additional information relevant to the user's next message: {rag_content}",
    )
```
If you want to permanently add messages to the chat history, you can use the `update_chat_ctx` method:
```python
from livekit.agents import ChatContext, ChatMessage

async def on_user_turn_completed(
    self, turn_ctx: ChatContext, new_message: ChatMessage,
) -> None:
    rag_content = await my_rag_lookup(new_message.text_content())
    turn_ctx.add_message(role="assistant", content=rag_content)
    # Persist the modified context so the injected message stays in history.
    await self.update_chat_ctx(turn_ctx)
```
This approach allows you to dynamically include context in the LLM's request.

Sources: Voice pipeline nodes | LiveKit Docs
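Both snippets above call a `my_rag_lookup` helper that isn't defined anywhere in this thread; it's yours to implement. As a rough sketch only, here is one hypothetical stand-in: a naive async keyword match over an in-memory dict (a real setup would typically query a vector store or search index instead). The `KNOWLEDGE_BASE` contents and matching logic are purely illustrative assumptions.

```python
import asyncio

# Illustrative in-memory "knowledge base"; a real implementation would
# query a vector store or search index instead.
KNOWLEDGE_BASE = {
    "pricing": "Pro plan costs $20/month; Enterprise is custom-quoted.",
    "refund": "Refunds are available within 30 days of purchase.",
}

async def my_rag_lookup(query: str) -> str:
    """Return knowledge-base entries whose key appears in the query."""
    query_lower = query.lower()
    matches = [text for key, text in KNOWLEDGE_BASE.items() if key in query_lower]
    return "\n".join(matches) or "No additional context found."
```

For example, `await my_rag_lookup("What is your refund policy?")` would return the refund entry, which the hook then injects into the chat context before the LLM request is made.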