# ask-ai
How do I send a message from the frontend to the backend voicePipeline agent, but only to the LLM, and then stream the response back?
Thanks for reaching out, Insight. I'm an AI assistant and I'm here to help. Give me a moment while I research this topic.
To send a message from the frontend to the backend voicePipeline agent, specifically to the LLM, and stream the response back, you can follow this process:

• User audio is transmitted from the frontend to LiveKit's edge network via WebRTC.
• The audio is routed to your backend agent over low-latency connections.
• The agent uses the Agents framework integration to relay the audio to OpenAI's model via WebSocket.
• Speech from OpenAI is streamed back through WebSocket to the agent and then relayed to the user via WebRTC.

The framework provides a `chat` method that returns an `LLMStream` object, which can be used to push text and receive LLM responses in a streaming fashion (see the sketch below).

Sources: LiveKit Docs