delightful-airline-96880
02/25/2025, 1:24 PM

self._llm.emit("metrics_collected", metrics)

How do I catch and print these metrics in a voice pipeline agent?

gray-monkey-83525
02/25/2025, 1:24 PM
To catch and print the metrics emitted by the llm class in a VoicePipelineAgent, you can subscribe to the metrics_collected event. Here's a basic example of how you can achieve this:
1. Subscribe to the event: use the agent's on method to listen for the metrics_collected event.
2. Log the metrics: define a callback function that logs or prints the metrics when the event is triggered.
Here's a code snippet to illustrate this:
from livekit.agents.pipeline import VoicePipelineAgent
from livekit.plugins import openai, deepgram, silero

# Initialize the agent
agent = VoicePipelineAgent(
    vad=silero.VAD.load(),
    stt=deepgram.STT(),
    llm=openai.LLM(),
    tts=openai.TTS(),
)

# Define a callback function to handle the metrics
def on_metrics_collected(metrics):
    print(f"Metrics collected: {metrics}")

# Subscribe to the metrics_collected event
agent.on("metrics_collected", on_metrics_collected)

# Start the agent (assuming room and participant are defined)
agent.start(room, participant)
This code sets up a listener for the metrics_collected event and prints the metrics whenever they are emitted by the llm class.
Would you like more details on any specific part of this process?
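As an aside, the on/emit pattern used above can be sketched in pure Python. This is an illustrative stand-in, not the real LiveKit EventEmitter; MiniEmitter and its methods are hypothetical names chosen to mirror the agent's on("metrics_collected", ...) / emit(...) usage:

```python
# Minimal sketch of an on/emit event pattern (illustrative only; the real
# VoicePipelineAgent provides its own event-emitter implementation).
from collections import defaultdict
from typing import Any, Callable


class MiniEmitter:
    def __init__(self) -> None:
        # Map event name -> list of subscribed handlers
        self._handlers: dict[str, list[Callable[[Any], None]]] = defaultdict(list)

    def on(self, event: str, handler: Callable[[Any], None]) -> None:
        """Subscribe a handler to an event."""
        self._handlers[event].append(handler)

    def emit(self, event: str, payload: Any) -> None:
        """Call every handler subscribed to this event with the payload."""
        for handler in self._handlers[event]:
            handler(payload)


collected = []
emitter = MiniEmitter()
emitter.on("metrics_collected", collected.append)
emitter.emit("metrics_collected", {"ttft": 0.5})
print(collected)  # → [{'ttft': 0.5}]
```

The key point is that emit is synchronous fan-out: every callback registered via on for that event name runs with the emitted payload.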
Sources:
- VoicePipelineAgent
- Capturing metrics

delightful-airline-96880
02/25/2025, 1:27 PM

metrics = LLMMetrics(
    timestamp=time.time(),
    request_id=request_id,
    ttft=ttft,
    duration=duration,
    cancelled=self._task.cancelled(),
    label=self._llm._label,
    completion_tokens=usage.completion_tokens if usage else 0,
    prompt_tokens=usage.prompt_tokens if usage else 0,
    total_tokens=usage.total_tokens if usage else 0,
    tokens_per_second=usage.completion_tokens / duration if usage else 0.0,
    error=None,
)
print("AFTER METRICS LLM METRICS", metrics)
self._llm.emit("metrics_collected", metrics)
I see info like:
AFTER METRICS LLM METRICS LLMMetrics(request_id='557cbc79-c20c-4daa-b8fc-586623d9b4e1', timestamp=1740489787.082572, ttft=0.5034498330205679, duration=1.0604407500941306, label='livekit.plugins.openai.llm.LLM', cancelled=False, completion_tokens=46, prompt_tokens=71, total_tokens=117, tokens_per_second=43.37818967812844, error=None)
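The derived fields in that log line follow directly from the usage counts and duration. Here is a small sketch with a hypothetical stand-in dataclass (FakeLLMMetrics is not the real livekit.agents.metrics type) showing how total_tokens and tokens_per_second relate to the raw numbers above:

```python
# Hypothetical stand-in for the LLMMetrics payload, showing how the
# derived fields are computed from completion/prompt tokens and duration.
import time
from dataclasses import dataclass


@dataclass
class FakeLLMMetrics:
    timestamp: float
    duration: float          # wall-clock seconds for the completion
    completion_tokens: int
    prompt_tokens: int

    @property
    def total_tokens(self) -> int:
        return self.completion_tokens + self.prompt_tokens

    @property
    def tokens_per_second(self) -> float:
        # Throughput counts only generated (completion) tokens
        return self.completion_tokens / self.duration if self.duration else 0.0


# Roughly the numbers from the log line above
m = FakeLLMMetrics(timestamp=time.time(), duration=1.06,
                   completion_tokens=46, prompt_tokens=71)
print(m.total_tokens)                  # 117
print(round(m.tokens_per_second, 2))   # 43.4
```

Note that ttft (time to first token) is measured separately from duration, so throughput is computed over the full generation time, not just the post-first-token window.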
I also have in my entrypoint:

@assistant.on("metrics_collected")
def _on_metrics_collected(mtrcs: metrics.AgentMetrics):
    print("metrics_collected caught", mtrcs)

But _on_metrics_collected doesn't show LLM metrics. It shows VAD and STT only.

gray-monkey-83525
02/25/2025, 1:27 PM
It sounds like the metrics_collected event is being emitted, but your callback might not be capturing the LLM metrics specifically. This could be due to the way the event is being handled or the specific type of metrics being emitted.
To ensure that your callback captures all types of metrics, including LLM metrics, you might want to check the following:
1. Ensure correct event subscription: make sure that the @assistant.on("metrics_collected") decorator is correctly set up to listen for all types of metrics, including LLM metrics.
2. Check Metric Types: In your callback function, you can add a check to differentiate between different types of metrics. This will help you confirm if the LLM metrics are being emitted but not printed due to filtering or other logic.
Here's an updated version of your callback function that includes a type check:

@assistant.on("metrics_collected")
def _on_metrics_collected(mtrcs: metrics.AgentMetrics):
    if isinstance(mtrcs, metrics.LLMMetrics):
        print("LLM metrics collected:", mtrcs)
    else:
        print("Other metrics collected:", mtrcs)
This code will print LLM metrics separately from other types of metrics, helping you verify if they are being emitted and captured correctly.
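The isinstance-based dispatch can be exercised in isolation with stand-in classes. LLMMetrics and STTMetrics below are illustrative dataclasses, not the real livekit.agents.metrics types, but the branching logic is the same as in the callback above:

```python
# Illustrative sketch of dispatching mixed metric payloads by type
# (stand-in dataclasses, not the real livekit.agents.metrics classes).
from dataclasses import dataclass


@dataclass
class LLMMetrics:
    ttft: float


@dataclass
class STTMetrics:
    audio_duration: float


def on_metrics_collected(mtrcs) -> str:
    # Branch on the concrete metrics type, as in the callback above
    if isinstance(mtrcs, LLMMetrics):
        return f"LLM metrics collected: {mtrcs}"
    return f"Other metrics collected: {mtrcs}"


print(on_metrics_collected(LLMMetrics(ttft=0.5)))
print(on_metrics_collected(STTMetrics(audio_duration=2.0)))
```

If the type check shows that LLM metrics never arrive at all, the problem is upstream (the emitter or the object you subscribed on), not in the callback's filtering.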
Would you like more details on any specific part of this process?
Sources:
- https://livekit-users.slack.com/archives/C07FY8WHGPM/p1730233417.503479

delightful-airline-96880
02/25/2025, 1:31 PM

gray-monkey-83525
02/25/2025, 1:31 PM