limited-spring-14213
09/09/2025, 10:25 PM

tall-airline-16756
09/09/2025, 10:26 PM
You can override the llm_node method in your custom Agent class. This way, you can take a closer look at the chat_ctx, tools, and model_settings before the API call happens. Here's a simple example to guide you:
from livekit.agents import Agent, llm, ModelSettings

class MyCustomAgent(Agent):
    async def llm_node(
        self,
        chat_ctx: llm.ChatContext,
        tools: list[llm.FunctionTool],
        model_settings: ModelSettings,
    ):
        # Log the full data being passed to the LLM
        print("Chat Context:", chat_ctx)
        print("Tools:", [tool.name for tool in tools])
        print("Model Settings:", model_settings)

        # Delegate to the default implementation
        async for chunk in Agent.default.llm_node(self, chat_ctx, tools, model_settings):
            yield chunk
Also, if you'd like a more detailed look, you can enable debug logging by setting the environment variable LK_OPENAI_DEBUG=1. This will give you insight into key parts of the request, like the converted chat_ctx, tool_choice, and the function-calling context.
If you're interested in taking full control and customizing the LLM interaction, you can skip the default method and call self.llm.chat() directly inside llm_node. Please feel free to reach out if you'd like me to walk you through that or if you have any other questions. I'm here to help!
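If it helps to see that control flow in isolation, here is a self-contained sketch of the same async-generator shape (plain asyncio, no LiveKit imports; FakeLLMStream and the uppercase post-processing are purely illustrative stand-ins for the stream returned by self.llm.chat() and for whatever rewriting you want to do):

```python
import asyncio

class FakeLLMStream:
    """Illustrative stand-in for the stream returned by self.llm.chat()."""
    def __init__(self, chunks):
        self._chunks = chunks

    async def _gen(self):
        for chunk in self._chunks:
            yield chunk

    def __aiter__(self):
        return self._gen()

async def llm_node(chat_ctx, stream):
    # Inspect (or rewrite) the context before the "API call" happens
    print("Chat Context:", chat_ctx)
    # Stream chunks through, post-processing each one before yielding downstream
    async for chunk in stream:
        yield chunk.upper()

async def main():
    chunks = []
    async for chunk in llm_node(["user: hi"], FakeLLMStream(["hello ", "world"])):
        chunks.append(chunk)
    return chunks

print(asyncio.run(main()))
```

The key design point carries over to the real llm_node: because it is an async generator, you can both edit the inputs before the call and transform each chunk on its way out.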
Sources: Docs::Agents | livekit-plugins/livekit-plugins-openai/livekit/plugins/openai/llm.py | pipeline-llm/replacing_llm_output.py | Modifying LLM output before TTS | LiveKit Docs | Agents 0.x migration guide | LiveKit Docs

limited-spring-14213
09/09/2025, 10:32 PM

tall-airline-16756
09/09/2025, 10:33 PM
You can set the LK_OPENAI_DEBUG environment variable just like other environment variables used by LiveKit, such as OPENAI_API_KEY. Here are a few options to get you started:
• In a .env or .env.local file: Simply add this line:
  LK_OPENAI_DEBUG=1
• Directly in your shell: You can run this command:
export LK_OPENAI_DEBUG=1
• In a Docker Compose file: Just include it under the environment section of your service:
  services:
    app:
      environment:
        - LK_OPENAI_DEBUG=1
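Whichever option you pick, the flag only takes effect if it is visible to the agent process itself. A quick way to confirm that from Python (this just checks the process environment; the actual debug handling lives in the OpenAI plugin source):

```python
import os

# Simulate the exported variable for this demonstration
os.environ["LK_OPENAI_DEBUG"] = "1"

# The check boils down to reading the process environment at runtime
debug_enabled = os.environ.get("LK_OPENAI_DEBUG") == "1"
print("LK_OPENAI_DEBUG enabled:", debug_enabled)
```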
If you’d like, I’m here to help you set this up in your specific environment—just let me know!
Sources: livekit-plugins/livekit-plugins-openai/README.md | README.md