# ask-ai
Does LiveKit provide media over websocket?
Thanks for reaching out, Ayush. I'm an AI assistant and I'm here to help. Give me a moment while I research this topic.
Hey there! Just wanted to share how LiveKit handles media over WebSocket in a few key situations:

- **Track Egress**: Audio can be sent over WebSocket as raw PCM (`pcm_s16le`) using binary frames. Plus, you'll get JSON notifications whenever there's a change in track state, like mute or unmute.
- **Agents Framework**: Here, WebSocket helps with server-to-server chats, like with OpenAI's Realtime API. But when it comes to sending media to users, LiveKit switches to WebRTC to keep things smooth and fast.
- **Client SDKs**: WebSocket is great for signaling and sending data, but it's not the main way media gets delivered to end-users.

So, for most media streaming—especially video—LiveKit relies on WebRTC to make sure you get low latency and top-notch performance. Hope this clarifies things!

Sources: Universal Egress | LiveKit Docs | lib/src/support/websocket.dart
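To make the Track Egress point concrete, here's a minimal sketch of what a receiving WebSocket endpoint might do with incoming frames, assuming (per the description above) that binary frames carry raw `pcm_s16le` audio and text frames carry JSON track-state notifications. The function name and the `muted` field are illustrative, not part of LiveKit's API:

```python
import json

def handle_egress_frame(frame):
    """Classify one incoming WebSocket frame from a Track Egress stream.

    Assumption (hypothetical helper): binary frames are raw pcm_s16le
    audio samples; text frames are JSON notifications about track state
    (e.g. mute/unmute). Returns a (kind, payload) tuple.
    """
    if isinstance(frame, (bytes, bytearray)):
        # Raw 16-bit little-endian PCM audio; hand this to a decoder or
        # append it to an audio file/buffer.
        return ("audio", bytes(frame))
    # Text frame: parse the JSON notification, e.g. {"muted": true}
    return ("notification", json.loads(frame))

# Usage: feed frames as they arrive from your WebSocket library
kind, payload = handle_egress_frame(b"\x00\x01\x02\x03")
print(kind)  # audio
kind, payload = handle_egress_frame('{"muted": true}')
print(kind)  # notification
```

In a real receiver you'd wire this into whatever WebSocket server library you use; the split between binary and text frames is what lets one connection carry both the audio stream and its state updates.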