# ask-ai
s
if I do this, how do I make sure to render previous messages in the frontend? is there any react hook that can capture the full chat_ctx?
Context preservation
By default, each new agent or task starts with a fresh conversation history for their LLM prompt. To include the prior conversation, set the chat_ctx parameter in the Agent or AgentTask constructor. You can either copy the prior agent's chat_ctx, or construct a new one based on custom business logic to provide the appropriate context.

from livekit.agents import ChatContext, function_tool, Agent

class TechnicalSupportAgent(Agent):
    def __init__(self, chat_ctx: ChatContext):
        super().__init__(
            instructions="""You are a technical support specialist. Help customers troubleshoot 
            product issues, setup problems, and technical questions.""",
            chat_ctx=chat_ctx
        )

class CustomerServiceAgent(Agent):
    # ...

    @function_tool()
    async def transfer_to_technical_support(self):
        """Transfer the customer to technical support for product issues and troubleshooting."""
        await self.session.generate_reply(instructions="Inform the customer that you're transferring them to the technical support team.")
        
        # Pass the chat context during handoff
        return TechnicalSupportAgent(chat_ctx=self.session.chat_ctx)

The complete conversation history for the session is always available in session.history.
t
Thanks for reaching out, Roman. I'm an AI assistant and I'm here to help. Give me a moment while I research this topic.
s
okay, let me know
any suggestions?
t
Here are a few ideas that might help make your implementation even better:

• Persist Chat History: If you'd like to keep your chat history even after refreshing the page or coming back later, you could try using local storage, session storage, or a backend service to save and load messages (a minimal sketch follows after this list).
• Custom Formatting: Think about adding some custom touches to your messages, like highlighting system messages, making user and agent messages look different, or showing timestamps in a friendlier format.
• Styling: Spruce up your chat interface with some CSS or a library like styled-components. You could even add animations when new messages pop in to make things feel more lively.
• Error Handling: It's helpful to have some error handling for when messages don't send properly, for example a retry button or a clear error message so users know what's going on.
• Typing Indicator: Adding a typing indicator can really boost the real-time feel by showing when someone is typing.
• Pagination: If your chat gets long, consider adding pagination or lazy loading to keep things running smoothly and easy to navigate.
• Custom Hook: Creating a custom hook for your chat logic can tidy up your components and make everything easier to manage.

Just let me know if you'd like me to walk you through how to add any of these features, I'm happy to help!
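For the first idea, here's a minimal sketch of persisting messages to local storage, assuming you already have a merged array of `ReceivedChatMessage` objects to pass in; the `usePersistedMessages` name, the `STORAGE_KEY`, and the `StoredMessage` shape are purely illustrative:

```ts
import { useEffect, useState } from 'react';
import type { ReceivedChatMessage } from '@livekit/components-react';

// Hypothetical storage key; you may want to scope it per room or per user.
const STORAGE_KEY = 'chat-history';

// Plain shape that is safe to JSON-serialize (ReceivedChatMessage carries a
// Participant reference, which should not be written to localStorage as-is).
type StoredMessage = {
  id: string;
  timestamp: number;
  message: string;
  from?: string;
};

export function usePersistedMessages(liveMessages: ReceivedChatMessage[]) {
  const [restored, setRestored] = useState<StoredMessage[]>([]);

  // Load previously saved messages once, on mount.
  useEffect(() => {
    const raw = window.localStorage.getItem(STORAGE_KEY);
    if (!raw) return;
    try {
      setRestored(JSON.parse(raw) as StoredMessage[]);
    } catch {
      window.localStorage.removeItem(STORAGE_KEY);
    }
  }, []);

  // Save a serializable snapshot of the merged list whenever it changes.
  useEffect(() => {
    if (liveMessages.length === 0) return;
    const snapshot: StoredMessage[] = liveMessages.map((m) => ({
      id: m.id,
      timestamp: m.timestamp,
      message: m.message,
      from: m.from?.identity,
    }));
    window.localStorage.setItem(STORAGE_KEY, JSON.stringify(snapshot));
  }, [liveMessages]);

  return { restored, liveMessages };
}
```

Storing a plain snapshot avoids trying to serialize the `Participant` reference attached to each message; the component can then render `restored` ahead of the live messages.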
s
there are some custom hooks in react that my component is already using, is the full chat_ctx included in any of them? how can I retrieve the previous messages so that I can render them in the UI upon user connection?
import { useEffect, useMemo } from 'react';
import {
  type ReceivedChatMessage,
  type TextStreamData,
  useChat,
  useRoomContext,
  useTranscriptions,
} from '@livekit/components-react';
import { transcriptionToChatMessage } from '@/lib/utils';

export default function useChatAndTranscription() {
  const transcriptions: TextStreamData[] = useTranscriptions();
  console.log('[DEBUG] useChatAndTranscription transcriptions:', transcriptions);

  const chat = useChat();
  console.log('[DEBUG] useChatAndTranscription chat:', chat);
  const room = useRoomContext();
  console.log('[DEBUG] useChatAndTranscription room:', room);

  useEffect(() => {
    console.log('[DEBUG] Transcriptions from useTranscriptions:', transcriptions);
    console.log('[DEBUG] Room state:', room?.state);
  }, [transcriptions, room]);

  // Log the room context to inspect participant identities
  useEffect(() => {
    if (room) {
      console.log('[DEBUG] Room localParticipant:', room.localParticipant?.identity);
      console.log('[DEBUG] Room remoteParticipants:',
        Array.from(room.remoteParticipants.values()).map((p) => p.identity)
      );
    }
  }, [room]);

  const mergedTranscriptions = useMemo(() => {
    console.log('[DEBUG] Merged transcriptions:', transcriptions);
    const merged: Array<ReceivedChatMessage> = [
      ...transcriptions.map((transcription) => transcriptionToChatMessage(transcription, room)),
      ...chat.chatMessages,
    ];
    return merged.sort((a, b) => a.timestamp - b.timestamp);
  }, [transcriptions, chat.chatMessages, room]);

  // Log the merged messages that get rendered
  //useEffect(() => {
  //  console.log('[DEBUG] Merged messages (transcriptions + chat):', mergedTranscriptions);
  //}, [mergedTranscriptions]);

  // Send an RPC to switch the output mode (voice/text)
  const sendToggleOutput = async (payload: string) => {
    if (!room) {
      console.warn('[LiveKit] No room context available for sendToggleOutput');
      return;
    }
    const agent = Array.from(room.remoteParticipants.values())[0];
    const agentIdentity = agent?.identity;
    if (!agentIdentity) {
      console.warn('[LiveKit] No agent identity found for performRpc');
      return;
    }
    try {
      await room.localParticipant.performRpc({
        destinationIdentity: agentIdentity,
        method: 'toggle_output',
        payload,
      });
      console.log(`[LiveKit] Sent toggle_output RPC with payload: ${payload}`);
    } catch (err) {
      console.error('[LiveKit] Error sending toggle_output RPC:', err);
    }
  };

  // Send an RPC to switch the input mode (audio on/off)
  const sendToggleInput = async (payload: string) => {
    if (!room) {
      console.warn('[LiveKit] No room context available for sendToggleInput');
      return;
    }
    const agent = Array.from(room.remoteParticipants.values())[0];
    const agentIdentity = agent?.identity;
    if (!agentIdentity) {
      console.warn('[LiveKit] No agent identity found for performRpc');
      return;
    }
    try {
      await room.localParticipant.performRpc({
        destinationIdentity: agentIdentity,
        method: 'toggle_input',
        payload,
      });
      console.log(`[LiveKit] Sent toggle_input RPC with payload: ${payload}`);
    } catch (err) {
      console.error('[LiveKit] Error sending toggle_input RPC:', err);
    }
  };

  return { messages: mergedTranscriptions, send: chat.send, sendToggleOutput, sendToggleInput };
}
t
Hey there! So, the full `chat_ctx` isn't directly included in the `useChat`, `useRoomContext`, or `useTranscriptions` hooks, based on what we know. But no worries, you can still pull together chat messages and transcriptions using these hooks like this:

• `useChat`: gives you `chatMessages` with all the received chat messages, plus a handy `send` function to send new ones.
• `useTranscriptions`: gives you transcription data that you can map and merge with your chat messages.
• `useRoomContext`: lets you access the room instance and participant info, though it doesn't include chat history directly.

Looks like you're already on the right track by merging transcriptions and chat messages into one array using `useMemo`. This way, you make sure all messages are combined and nicely sorted for your UI. If you want to save or fetch previous messages beyond the current session, you might want to think about adding a backend service or using local storage to keep that chat history safe. Just give me a shout if you want help setting that up, I'm here for you!

Sources: Add chat to your React app | LiveKit Docs | packages/react/src/context/index.ts | complex-agents/teleprompter/teleprompter-frontend/hooks/use-transcriber.ts
s
yeah, but the thing is... all this seems to be useful for merging incoming messages in real time but not for retrieving previous messages, right? should I use `room.local_participant.publish_data` when the user connects to the room, to send all of the previous messages to the frontend?
t
You're absolutely right that the current hooks are meant for merging incoming messages in real time, but they don't pull in previous messages. Using `room.local_participant.publish_data` to send past messages when a user connects isn't really the best approach, since it's designed for real-time data sharing between connected participants and doesn't handle message history. To get message history working, you'd want to save chat messages in a backend database or some kind of storage. Then, when someone joins, you can grab those stored messages from the backend and send them to the frontend so they show up. If you'd like, I'd be happy to help you set this up, just let me know!

Sources: Data messages | LiveKit Docs | Sending text | LiveKit Docs
s
chat_ctx is now being stored in Redis. Basically, when creating a new room I check whether there's a saved context for that user; if there is, chat_ctx is passed to the agent's constructor. What I'm asking is: how do I send that context to the frontend?
t
To get the `chat_ctx` stored in Redis over to the frontend, you'll usually want to set up a custom solution. Here's a simple way to think about it:

1. Backend API: Build an API endpoint on your backend that grabs the `chat_ctx` from Redis for a specific user or room.
2. Frontend Fetch: When someone joins the room, have the frontend ask this API for the stored `chat_ctx`.
3. Render Context: Use what you fetched to fill in the chat UI on the frontend, making sure it blends smoothly with the live messages.

This method helps keep everything secure while getting the context to the frontend. If you want, I'm happy to walk you through any of these steps in more detail!

Sources: livekit-agents/livekit/agents/llm/chat_context.py | client/app/actions.ts
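For steps 2 and 3, here's a rough sketch of the frontend side, assuming a hypothetical `/api/chat-context` endpoint on your backend that reads the saved `chat_ctx` from Redis and returns it as JSON; the `StoredTurn` shape and the `useStoredHistory` name are assumptions about how you choose to serialize and consume it:

```ts
import { useEffect, useState } from 'react';

// Assumed shape of what your backend returns after reading chat_ctx from Redis;
// adapt it to however you serialize ChatContext items (role, text content, etc.).
type StoredTurn = {
  role: 'user' | 'assistant' | 'system';
  content: string;
  timestamp?: number;
};

// Hypothetical endpoint: the backend looks up the saved chat_ctx for this room/user.
async function fetchChatContext(roomName: string): Promise<StoredTurn[]> {
  const res = await fetch(`/api/chat-context?room=${encodeURIComponent(roomName)}`);
  if (!res.ok) return [];
  return (await res.json()) as StoredTurn[];
}

// Hook that loads the stored history once when the user joins the room.
export function useStoredHistory(roomName?: string) {
  const [history, setHistory] = useState<StoredTurn[]>([]);

  useEffect(() => {
    if (!roomName) return;
    let cancelled = false;
    fetchChatContext(roomName).then((turns) => {
      if (!cancelled) setHistory(turns);
    });
    return () => {
      cancelled = true;
    };
  }, [roomName]);

  return history;
}
```

You could then render `history` ahead of the live `messages` array from `useChatAndTranscription`, or map each stored turn into the same shape your message list already expects.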