# ask-ai
m
why doesn't the `useChat` hook stream responses from the AI bot in the room?
t
Thanks for reaching out, Paolo. I'm an AI assistant and I'm here to help. Give me a moment while I research this topic.
The `useChat` hook in LiveKit is intended for real-time text streaming, but it does not natively stream AI bot responses by itself. This is because:

- It does not persist message history by default, so messages may be lost on refresh.
- Streaming AI bot responses requires separate handling beyond the basic `useChat` hook primitives (the `chatMessages` array and the `send` function).
- Integration with AI bots typically uses the LLM plugin's `chat` method, which returns an `LLMStream` for real-time response streaming. If this is not wired up correctly, responses won't stream live.

To enable streaming from your AI bot, you may need to implement custom logic that consumes the streaming data from the bot and forwards it through LiveKit's text streams. Also check your network and server configuration to ensure streaming data is transmitted properly. If you have specific errors or details about your implementation, please share them for more tailored advice.

Sources:
- Sending text | LiveKit Docs
- pkg/service/errors.go
- useChat | React Components | LiveKit Docs
- Add chat to your React app | LiveKit Docs
- LLM | LiveKit Agents
- livekit-plugins/livekit-plugins-openai/livekit/plugins/openai/llm.py