wooden-librarian-60399 — 08/06/2025, 8:55 PM

tall-airline-16756 — 08/06/2025, 8:56 PM
You can force tool usage by setting the `tool_choice` parameter in your LLM integration. Here's how you can do it:
1. Set up the LLM integration: include the `tool_choice` parameter and set it to `'required'`. For example:

```python
from livekit.agents import AgentSession
from livekit.plugins import openai

session = AgentSession(
    llm=openai.LLM.with_x_ai(
        model="grok-2-public",
        temperature=1.0,
        tool_choice="required",  # force the model to call a tool
    ),
    # ... other configurations ...
)
```
2. Configure the pipeline agent: you can also set `tool_choice` per request in a `beforeLLMCallback` (this example uses the Node.js `VoicePipelineAgent` API). For example:

```javascript
const agent = new VoicePipelineAgent(vad, stt, llm, tts, {
  // ... other options ...
  beforeLLMCallback: (agent, chatCtx) => {
    return agent.llm.chat({
      chatCtx,
      fncCtx: agent.fncCtx,
      tool_choice: 'required', // Force tool usage
    });
  },
});
```
The `tool_choice` parameter is supported by multiple LLM integrations, including Fireworks, Together AI, and Perplexity. It allows you to control tool usage with options like:
• `'auto'`: Let the model decide (default).
• `'required'`: Force tool usage.
• `'none'`: Disable tool usage.
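To make the difference between the three values concrete, here is a minimal, LiveKit-independent sketch; the `interpret_tool_choice` function and its return shape are purely illustrative assumptions, not part of any LiveKit API:

```python
def interpret_tool_choice(tool_choice: str, tools: list[str]) -> dict:
    """Illustrative sketch (not LiveKit code) of how an OpenAI-style
    backend treats the three tool_choice values."""
    if tool_choice == "none":
        # Tools are withheld entirely; the model must answer in plain text.
        return {"tools_offered": [], "tool_call_forced": False}
    if tool_choice == "required":
        # Tools are offered and the model must call at least one of them.
        return {"tools_offered": tools, "tool_call_forced": True}
    if tool_choice == "auto":
        # Tools are offered; the model decides whether to call one (default).
        return {"tools_offered": tools, "tool_call_forced": False}
    raise ValueError(f"unsupported tool_choice: {tool_choice!r}")

print(interpret_tool_choice("required", ["get_weather"]))
# → {'tools_offered': ['get_weather'], 'tool_call_forced': True}
```

In a real session, the value you pass to the LLM integration simply selects which of these behaviors the provider's API applies on each request.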
Let me know if you need further assistance!
Sources: xAI LLM integration guide | LiveKit Docs | Fireworks AI LLM integration guide | LiveKit Docs | livekit-plugins/livekit-plugins-aws/livekit/plugins/aws/experimental/realti... | Perplexity LLM integration guide | LiveKit Docs