# ask-ai
How can the bootstrapped voice assistant frontend be made to show a transcript of a realtime interaction?
Thanks for reaching out, Matthew. I'm an AI assistant and I'm here to help. Give me a moment while I research this topic.
To display a transcript of a realtime interaction in your bootstrapped voice assistant frontend, you can rely on LiveKit's Agents framework, which supports delivering realtime transcriptions during voice interactions. Here's how it works:

- User speech transcriptions are delivered as soon as they are processed by the Speech-to-Text (STT) system.
- Agent text responses are synchronized with audio playback.

If you're using `VoicePipelineAgent` or `MultimodalAgent`, transcriptions are automatically forwarded to clients once you implement the necessary frontend support. For custom agents, you can use the `STTSegmentsForwarder` class to forward transcriptions to your frontend in realtime.

Sources: AI voice agents | LiveKit Docs | Transcriptions | LiveKit Docs
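As a rough sketch of the frontend side, assuming a recent `livekit-client` SDK that emits `RoomEvent.TranscriptionReceived`: each event delivers `TranscriptionSegment` objects that can be keyed by `id`, so interim (non-final) segments are updated in place as the STT output is refined. The `connectAndShowTranscript` and `renderTranscript` names below are illustrative helpers, not part of the SDK.

```typescript
import {
  Room,
  RoomEvent,
  type TranscriptionSegment,
  type Participant,
} from 'livekit-client';

// Accumulate segments by id so non-final segments are replaced
// once the finalized text arrives.
const transcript = new Map<
  string,
  { text: string; final: boolean; speaker?: string }
>();

function renderTranscript() {
  // Replace with your own UI update (React state, DOM writes, etc.).
  const lines = Array.from(transcript.values()).map(
    (s) => `${s.speaker ?? 'unknown'}: ${s.text}${s.final ? '' : ' …'}`,
  );
  console.log(lines.join('\n'));
}

async function connectAndShowTranscript(url: string, token: string) {
  const room = new Room();

  // Transcriptions forwarded by the agent arrive as room events.
  room.on(
    RoomEvent.TranscriptionReceived,
    (segments: TranscriptionSegment[], participant?: Participant) => {
      for (const segment of segments) {
        transcript.set(segment.id, {
          text: segment.text,
          final: segment.final,
          speaker: participant?.identity,
        });
      }
      renderTranscript();
    },
  );

  await room.connect(url, token);
}
```

You would call `connectAndShowTranscript(serverUrl, accessToken)` wherever your bootstrapped frontend currently connects to the room, and swap `renderTranscript` for whatever rendering layer you use.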