# ask-ai
When the frontend uses `useChat` for the chat interface to send messages, how does the backend Python get the chat messages from the user? When the user speaks, `on_user_turn_completed` is used.
Thanks for reaching out, zeal-1. I'm an AI assistant and I'm here to help. Give me a moment while I research this topic.
Hi there! I wanted to walk you through how the backend Python application handles chat messages coming from the frontend's `useChat` interface. These messages are sent as text streams on the `lk.chat` topic. So, whenever a user sends a message, it travels through LiveKit's data stream system, and the backend is set up to listen by registering a handler specifically for the `lk.chat` topic. Here's a simple example to show you how the backend can catch these incoming chat messages:
```python
import asyncio
from livekit import rtc

async def on_chat_message_received(reader: rtc.TextStreamReader, participant_identity: str):
    full_text = await reader.read_all()
    print(f"Received chat message from {participant_identity}: '{full_text}'")

# Register for the lk.chat topic; the callback is synchronous, so spawn a task to read the stream
room.register_text_stream_handler(
    "lk.chat",
    lambda reader, identity: asyncio.create_task(on_chat_message_received(reader, identity)),
)
```
This way, the backend is ready to receive chat messages in real time, whether they come from the frontend's `useChat` or LiveKit's built-in Chat component. I hope this helps clarify things! If you have any more questions or need further assistance, feel free to ask.

Sources:
- Agents v0.x migration guide - Node.js | LiveKit Docs
- packages/react/src/prefabs/Chat.tsx
- Sending text | LiveKit Docs
- Agents 0.x migration guide | LiveKit Docs
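
If it helps to see where the `room` object comes from, here's a minimal standalone sketch of a backend script that connects to a room and registers the same handler. The `LIVEKIT_URL` and `LIVEKIT_TOKEN` environment variables and the fixed 60-second wait are placeholders I've assumed for illustration; in a real deployment you'd generate an access token server-side and keep the process running however suits your app:

```python
import asyncio
import os

from livekit import rtc

async def main():
    room = rtc.Room()
    # Connect to the room; LIVEKIT_URL and LIVEKIT_TOKEN are placeholder env vars for this sketch
    await room.connect(os.environ["LIVEKIT_URL"], os.environ["LIVEKIT_TOKEN"])

    async def read_chat(reader: rtc.TextStreamReader, participant_identity: str):
        text = await reader.read_all()
        print(f"{participant_identity}: {text}")

    # Listen for chat messages that useChat sends on the lk.chat topic
    room.register_text_stream_handler(
        "lk.chat",
        lambda reader, identity: asyncio.create_task(read_chat(reader, identity)),
    )

    await asyncio.sleep(60)  # keep the connection alive while messages arrive
    await room.disconnect()

if __name__ == "__main__":
    asyncio.run(main())
```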