bulky-actor-93512
09/12/2025, 3:39 PM

tall-airline-16756
09/12/2025, 3:39 PM
In LiveKit Agents, the speaking-related events differ by version:
• For LiveKit Agents v0.x: The events are user_started_speaking, user_stopped_speaking, agent_started_speaking, and agent_stopped_speaking. These are triggered through voice activity detection (VAD), which tracks when someone is talking.
• For LiveKit Agents v1.x: The events have been updated to user_state_changed and agent_state_changed. In these, the new_state field tells you when someone has switched to a state like speaking or listening.
If you'd like, I'd be happy to provide more details or some examples of how to implement these events. Just let me know!
Sources: Agents 0.x migration guide | LiveKit Docs | Agents v0.x migration guide - Node.js | LiveKit Docs | Events and error handling | LiveKit Docs | VoicePipelineAgent | LiveKit Docs | Turn detection and interruptions | LiveKit Docs

bulky-actor-93512
09/12/2025, 3:40 PM

tall-airline-16756
09/12/2025, 3:40 PM
For LiveKit Agents v0.x:
@agent.on("user_started_speaking")
def user_started_speaking():
    print("User started speaking")

@agent.on("user_stopped_speaking")
def user_stopped_speaking():
    print("User stopped speaking")

@agent.on("agent_started_speaking")
def agent_started_speaking():
    print("Agent started speaking")

@agent.on("agent_stopped_speaking")
def agent_stopped_speaking():
    print("Agent stopped speaking")
For LiveKit Agents v1.x:
@session.on("user_state_changed")
def on_user_state_changed(ev):
    if ev.new_state == "speaking":
        print("User started speaking")
    elif ev.new_state == "listening":
        print("User stopped speaking")

@session.on("agent_state_changed")
def on_agent_state_changed(ev):
    if ev.new_state == "speaking":
        print("Agent started speaking")
If you have any questions or need a bit more explanation, please don’t hesitate to ask—I’m here to help!