polite-furniture-24847
08/14/2025, 1:11 AM

tall-airline-16756
08/14/2025, 1:11 AM
chat_ctx
parameter. This way, if you get connected to a new agent, they can easily catch up on what you’ve already shared.
• Model Context Protocol (MCP): MCP gives the agent a standard way to reach external tools and data sources, which can include stored conversation state, so context can persist across sessions and the chat feels continuous.
• Interruption Handling: We’re on the lookout for any interruptions and handle them smoothly to keep the conversation flowing naturally.
• Inactivity Detection: We check if you’re still there, so we don’t repeat questions unnecessarily if you’ve stepped away for a bit.
Using these approaches, we aim to make your support experience as smooth and hassle-free as possible.
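To make the context-handoff idea concrete, here is a small framework-free sketch of the pattern: two agents share one chat history, so the agent you get handed to can catch up instantly. The names here (SupportAgent, hear, handoff) are illustrative stand-ins, not LiveKit APIs; in LiveKit the equivalent is passing chat_ctx when constructing the new agent.

```python
# Framework-free sketch: two agents share one history list, so a
# handoff carries the conversation over. Illustrative names only.

class SupportAgent:
    def __init__(self, name: str, chat_ctx: list[dict]):
        self.name = name
        self.chat_ctx = chat_ctx  # shared reference, not a copy

    def hear(self, role: str, content: str) -> None:
        self.chat_ctx.append({"role": role, "content": content})

def handoff(current: SupportAgent, name: str) -> SupportAgent:
    # The new agent receives the existing history, mirroring the
    # chat_ctx parameter described above.
    return SupportAgent(name, current.chat_ctx)

tier1 = SupportAgent("tier1", chat_ctx=[])
tier1.hear("user", "My speakers crackle on calls.")
tier2 = handoff(tier1, "tier2")
# tier2's chat_ctx already contains what the user told tier1
```

Because the history is passed by reference rather than copied, anything the user says after the handoff is also visible to both agents.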
Sources: Workflows | LiveKit Docs | Model Context Protocol Integration | livekit/livekit_composite | DeepWiki | Turn detection and interruptions | LiveKit Docs | Voice Assistants | livekit/livekit_composite | DeepWiki

polite-furniture-24847
08/14/2025, 1:16 AM

tall-airline-16756
08/14/2025, 1:18 AM
chat_ctx
parameter or Model Context Protocol (MCP), the agent remembers what the customer shared, so they don't have to repeat themselves, making the whole experience smoother and more pleasant.

polite-furniture-24847
08/14/2025, 1:22 AM

tall-airline-16756
08/14/2025, 1:27 AM
chat_ctx. You'll want to set it explicitly when initializing an agent or task, so the conversation history carries over and responses stay coherent.
Here’s a quick example:
from livekit.agents import Agent, ChatContext

class TechnicalSupportAgent(Agent):
    def __init__(self, chat_ctx: ChatContext):
        super().__init__(
            instructions="You are a technical support specialist.",
            chat_ctx=chat_ctx,
        )
So, if you're working with OpenAI's LLM, just make sure to pass chat_ctx during initialization. That way, your conversations will flow smoothly and remember what you've talked about before.
Sources: Workflows | LiveKit Docs

polite-furniture-24847
08/14/2025, 1:36 AM

tall-airline-16756
08/14/2025, 1:37 AM
chat_ctx
. But if you’re seeing some memory hiccups or the same questions popping up again during the session, it might be related to how the session is managed or how the LLM is set up.
Here are a few friendly tips to keep the context flowing smoothly in a single session:
• Make sure the conversation history is being passed correctly to the LLM with every interaction.
• Double-check the LLM's settings, like the temperature or other parameters, to see if they match what you're aiming for.
• Confirm that the input to the LLM includes the full conversation history so far—that’s key for keeping context intact.
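The first and third tips above can be sketched without any framework: every call to the model includes the full history so far, not just the latest message. Here, call_llm is a hypothetical stand-in for your real model client, not a LiveKit or OpenAI API.

```python
# Sketch of "pass the full conversation history on every turn".
# call_llm is a placeholder for a real model client.

history: list[dict] = []

def call_llm(messages: list[dict]) -> str:
    # A real client would send `messages` to the model here.
    return f"(reply after {len(messages)} messages)"

def chat_turn(user_text: str) -> str:
    history.append({"role": "user", "content": user_text})
    reply = call_llm(history)  # full history, so context stays intact
    history.append({"role": "assistant", "content": reply})
    return reply

chat_turn("My audio is choppy.")
chat_turn("It happens only on calls.")
# By the second turn, the model sees all three prior messages.
```

If a bug sends only the latest message instead of `history`, the model loses context and starts re-asking questions, which is exactly the symptom described above.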
If it's still acting up, don't worry; it might just need a bit more digging into your setup. You've got this!

polite-furniture-24847
08/14/2025, 1:38 AM

tall-airline-16756
08/14/2025, 1:38 AM
1. Create a ChatContext: Initialize an empty ChatContext object to keep track of the conversation history.
2. Add Messages to Context: Use the add_message method to add past messages, specifying who said what (a role such as "user" or "assistant") and the content.
3. Pass Context to Agent: When you kick off a new agent session, pass your prepared ChatContext to the agent's constructor via the chat_ctx parameter.
Here’s a quick example:
from livekit.agents import Agent, ChatContext

# Initialize an empty ChatContext
chat_ctx = ChatContext.empty()

# Add past messages to the context
chat_ctx.add_message(role="user", content="Hello, I need help.")
chat_ctx.add_message(role="assistant", content="How can I assist you today?")

# Start the agent session with the prepared context
agent = Agent(
    instructions="Your instructions here",
    llm=...,
    chat_ctx=chat_ctx,
)
await session.start(agent=agent)
By following these steps, you'll make sure the LLM gets the full conversation history, so it can respond in a way that really makes sense. If you have any questions or need a hand, just let me know!

polite-furniture-24847
08/14/2025, 1:45 AM

tall-airline-16756
08/14/2025, 1:46 AM
chat_ctx
parameter in the Agent constructor. Otherwise, it treats every turn like a brand new chat.
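Here's a toy illustration of that point: without a carried-over context, each turn looks like a brand-new chat to the model, so it can't resolve references to earlier turns. The answer function below is a made-up stand-in "model" that only knows what is in the messages it receives.

```python
# Toy contrast between a stateless turn and a chat_ctx-style turn.
# `answer` is an illustrative stand-in, not a real model API.

def answer(messages: list[dict]) -> str:
    # This stand-in "model" only knows what is in `messages`.
    seen = " ".join(m["content"] for m in messages)
    return "try restarting the router" if "router" in seen else "which device is affected?"

# Stateless: only the latest message is sent each turn.
stateless = answer([{"role": "user", "content": "It is still broken."}])

# Stateful: the accumulated history is sent along with it.
history = [
    {"role": "user", "content": "My router keeps dropping Wi-Fi."},
    {"role": "assistant", "content": "Let's troubleshoot."},
    {"role": "user", "content": "It is still broken."},
]
stateful = answer(history)
# stateless has to re-ask; stateful can act on the earlier turn.
```

The stateless call has no idea a router was ever mentioned, which is why skipping chat_ctx makes the agent repeat its questions.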
Sources: Workflows | LiveKit Docs | Workflows | LiveKit Docs | Testing and evaluation | LiveKit Docs