cold-waiter-86140
09/12/2025, 3:46 PM
tall-airline-16756
09/12/2025, 3:47 PM
LiveKit agents use the ChatContext
object to store the sequence of messages and tool interactions within a session, which helps agents remember prior messages during a chat. That said, there are a few things to keep in mind:
• Each session starts fresh by default, so if you want to carry over context between agents or sessions, you’ll need to explicitly pass the chat_ctx parameter when setting up the new agent.
• There are some technical limits in real-time pipelines, like a maximum of 40 messages and a 1 kB size limit per message. When these limits are reached, older messages get trimmed to make room for new ones.
• LiveKit doesn’t save message history beyond the session itself, so if you want to keep conversations longer term, you’ll want to set up your own storage solution.
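To make the fresh-start and trimming behavior above concrete, here is a minimal, self-contained Python sketch. The `ChatContext` and `Agent` classes below are toy stand-ins that model the behavior described (the 40-message cap, the 1 kB per-message limit, and explicit context handoff); they are not the actual LiveKit API.

```python
from dataclasses import dataclass, field

MAX_MESSAGES = 40         # real-time pipeline message cap described above
MAX_MESSAGE_BYTES = 1024  # 1 kB per-message size limit

@dataclass
class ChatMessage:
    role: str
    content: str

@dataclass
class ChatContext:
    """Toy stand-in for a session chat context (not the LiveKit class)."""
    messages: list = field(default_factory=list)

    def add(self, role: str, content: str) -> None:
        # Enforce the per-message size limit by truncating oversized content.
        data = content.encode("utf-8")[:MAX_MESSAGE_BYTES]
        self.messages.append(ChatMessage(role, data.decode("utf-8", "ignore")))
        # Trim the oldest messages once the cap is exceeded.
        while len(self.messages) > MAX_MESSAGES:
            self.messages.pop(0)

@dataclass
class Agent:
    """Toy agent: starts fresh unless an existing chat_ctx is passed in."""
    name: str
    chat_ctx: ChatContext = field(default_factory=ChatContext)

# The first agent accumulates context...
greeter = Agent("greeter")
greeter.chat_ctx.add("user", "My order number is 1234.")

# ...and hands it to the next agent explicitly, so nothing is forgotten.
support = Agent("support", chat_ctx=greeter.chat_ctx)
print(support.chat_ctx.messages[0].content)
```

Passing the previous agent's chat_ctx into the new agent is what keeps the context alive across a handoff; omit it and the new agent starts with an empty history.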
If you’re working with long-running conversations and need more advanced memory management, integrating with external tools like Letta or LlamaIndex can be a great option. They offer stateful memory and can help extend the context your agents work with.
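Since the platform doesn't persist messages beyond the session, one simple storage option is to serialize the history yourself at shutdown and reload it at the next startup. Here's a rough sketch, assuming the history can be exported as a list of role/content dicts; `save_history` and `load_history` are hypothetical helpers, not LiveKit functions.

```python
import json
import tempfile
from pathlib import Path

def save_history(session_id: str, history: list, out_dir: str) -> Path:
    """Persist a session's message history as JSON.
    Hypothetical helper: LiveKit itself does not store messages
    beyond the session, so this is your own storage layer."""
    path = Path(out_dir) / f"{session_id}.json"
    path.write_text(json.dumps(history, indent=2))
    return path

def load_history(session_id: str, out_dir: str) -> list:
    """Reload a prior session's messages, e.g. to seed a new chat context."""
    path = Path(out_dir) / f"{session_id}.json"
    return json.loads(path.read_text()) if path.exists() else []

# Typical flow: dump whatever the session exposes as history at shutdown,
# then feed it back into the next session's chat context on startup.
tmp = tempfile.gettempdir()
history = [
    {"role": "user", "content": "hello"},
    {"role": "assistant", "content": "hi there"},
]
save_history("session-abc", history, out_dir=tmp)
restored = load_history("session-abc", out_dir=tmp)
```

A database or external memory service (like the Letta or LlamaIndex integrations mentioned above) would replace the JSON file here, but the save-on-shutdown / load-on-startup shape stays the same.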
Sources: Workflows | LiveKit Docs | LLM Integrations | livekit/livekit_composite | DeepWiki | Testing and evaluation | LiveKit Docs | livekit-agents/livekit/agents/llm/remote_chat_context.py | Docs::Home
cold-waiter-86140
09/12/2025, 3:49 PM
tall-airline-16756
09/12/2025, 3:49 PM
Yes: the full conversation is kept in the session.history property, and the chat_ctx holds onto this context unless it’s specifically reset.
So, as long as your session is still active, the agent can easily look back at earlier parts of your conversation, no matter how much time has passed. I hope this helps clarify things for you!
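As a rough mental model of that retention behavior, here is a toy class (not the LiveKit API): history accumulates for the whole session and stays queryable, regardless of elapsed time, until something explicitly resets it.

```python
class SessionHistory:
    """Toy model of in-session memory: turns accumulate for the lifetime
    of the session and remain retrievable until an explicit reset.
    Hypothetical names, for illustration only."""

    def __init__(self):
        self._items = []

    def append(self, role: str, content: str) -> None:
        self._items.append({"role": role, "content": content})

    def recall(self, keyword: str) -> list:
        # The agent can look back at any earlier turn in the session.
        return [m for m in self._items if keyword in m["content"]]

    def reset(self) -> None:
        self._items.clear()

history = SessionHistory()
history.append("user", "My name is Sam.")
history.append("assistant", "Nice to meet you, Sam!")
# Later in the same session, earlier turns are still retrievable:
print(history.recall("Sam"))
```

After `reset()`, the same `recall` call would come back empty, which mirrors the "unless it's specifically reset" caveat above.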
Sources: Workflows | LiveKit Docs | tests/test_agent_session.py | Testing and evaluation | LiveKit Docs