# ask-ai
**User:** How can I do some context engineering to reduce the conversation history in a multi-turn conversation? I want to reduce the chat history to the last 2–4 turns, but I also want to provide the AI agent with a summary of what has happened before.
**Assistant:** Thanks for reaching out, Anthony. I'm an AI assistant and I'm here to help. Give me a moment while I research this topic.
To help you reduce conversation history to the last 2–4 turns while still capturing the important context, here are a few approaches you might find useful:

- **Truncate the chat context:** Use LiveKit's `truncate()` method to keep just the last 2–4 message exchanges. This way, you retain the most recent conversation along with the initial system message that holds your instructions.
- **Dynamic truncation:** Use the `before_llm_cb` callback in the `VoicePipelineAgent`, which lets you dynamically limit the context right before each LLM call.
- **Manual summarization:** Create a summary of earlier interactions with a custom method such as `UserData.summarize()`, then add it as a system message when transferring agents or resetting context. Since LiveKit doesn't have built-in summarization APIs, you'll need to handle summarization yourself.

If you'd like, I'd be more than happy to share detailed code examples for any of these methods; just let me know!

Sources: livekit-plugins/livekit-plugins-aws/livekit/plugins/aws/experimental/realti... | agents/src/llm/chat_context.ts | VoicePipelineAgent | LiveKit Docs | complex-agents/medical_office_triage/README.md