# ask-ai
if I do this, how do I make sure to render previous messages in the frontend? is there any react hook that can capture the full chat_ctx?
Context preservation
By default, each new agent or task starts with a fresh conversation history for their LLM prompt. To include the prior conversation, set the chat_ctx parameter in the Agent or AgentTask constructor. You can either copy the prior agent's chat_ctx, or construct a new one based on custom business logic to provide the appropriate context.

from livekit.agents import ChatContext, function_tool, Agent

class TechnicalSupportAgent(Agent):
    def __init__(self, chat_ctx: ChatContext):
        super().__init__(
            instructions="""You are a technical support specialist. Help customers troubleshoot 
            product issues, setup problems, and technical questions.""",
            chat_ctx=chat_ctx
        )

class CustomerServiceAgent(Agent):
    # ...

    @function_tool()
    async def transfer_to_technical_support(self):
        """Transfer the customer to technical support for product issues and troubleshooting."""
        await self.session.generate_reply(instructions="Inform the customer that you're transferring them to the technical support team.")
        
        # Pass the chat context during handoff
        return TechnicalSupportAgent(chat_ctx=self.session.chat_ctx)

The complete conversation history for the session is always available in session.history.
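If you need that raw transcript server-side (for example, to forward it to your frontend yourself), here is a minimal sketch of reading session.history. It assumes session.history is a ChatContext whose messages are exposed via an items attribute, so verify the attribute names against your installed livekit-agents version.

# Hypothetical sketch: collect the session's conversation history as plain dicts.
# Assumes session.history is a ChatContext exposing its messages via `.items`.
def history_as_dicts(session) -> list[dict]:
    messages = []
    for item in session.history.items:
        # Each item is expected to carry a role and some text content;
        # getattr keeps this tolerant of non-message items (e.g. tool calls).
        messages.append({
            "role": getattr(item, "role", None),
            "content": getattr(item, "text_content", None),
        })
    return messages

From there you can serialize the list to JSON and ship it to the client however your app already moves data.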
Thanks for reaching out, Roman. I'm an AI assistant and I'm here to help. Give me a moment while I research this topic.