# ask-ai
How do I see the LLM response logged from my voice agent pipeline?
Thanks for reaching out, Abdullah. I'm an AI assistant and I'm here to help. Give me a moment while I research this topic.
To view the LLM response logged from your voice agent pipeline in LiveKit, subscribe to the `metrics_collected` event and use the `metrics` module. Here's an example implementation:
```python
# Import the metrics module
from livekit.agents import metrics

# Subscribe to metrics collection events on your pipeline agent instance
@agent.on("metrics_collected")
def _on_metrics_collected(mtrcs: metrics.AgentMetrics):
    # Use the helper function to format and log metrics based on type
    metrics.log_metrics(mtrcs)
```
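Since your question is specifically about the LLM, you can also filter the collected metrics by type. The sketch below assumes `agent` is the same pipeline agent instance as above; `LLMMetrics` and the fields used here (`ttft`, `prompt_tokens`, `completion_tokens`) come from the same metrics module, but verify the exact attribute names against your installed livekit-agents version:

```python
import logging

from livekit.agents import metrics
from livekit.agents.metrics import LLMMetrics

logger = logging.getLogger("voice-agent")  # arbitrary logger name, for illustration only

@agent.on("metrics_collected")
def _on_llm_metrics(mtrcs: metrics.AgentMetrics):
    # Only act on metrics emitted by the LLM stage of the pipeline
    if isinstance(mtrcs, LLMMetrics):
        logger.info(
            "LLM call: ttft=%.2fs prompt_tokens=%s completion_tokens=%s",
            mtrcs.ttft,
            mtrcs.prompt_tokens,
            mtrcs.completion_tokens,
        )
```

Note that these metrics capture timing and token counts for each LLM call rather than the generated text itself.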
For more comprehensive tracking, you can use the `UsageCollector` class to aggregate metrics over a session:
```python
import logging

from livekit.agents import metrics

logger = logging.getLogger("voice-agent")  # replace with your agent's existing logger

# Create a usage collector to aggregate metrics over the session
usage_collector = metrics.UsageCollector()

# Add metrics to the collector as they are received
@agent.on("metrics_collected")
def _on_metrics_collected(mtrcs: metrics.AgentMetrics):
    # Pass metrics to the collector for aggregation
    usage_collector.collect(mtrcs)

# Log aggregated summary at the end of the session
async def log_usage():
    summary = usage_collector.get_summary()
    logger.info(f"Usage: {summary}")

# Register the summary logging as a shutdown callback
# (ctx is the JobContext passed to your agent's entrypoint)
ctx.add_shutdown_callback(log_usage)
```
These steps will log detailed metrics about your agent's performance, including LLM latency and token usage. Let me know if you need further assistance!

Sources: Capturing metrics | LiveKit Docs