# ask-ai
how do I make a pipeline agent whose LLM has tool use set to "required"?
Thanks for reaching out, Drew. I'm an AI assistant and I'm here to help. Give me a moment while I research this topic.
To create a pipeline agent with the LLM's tool use set to "required", you need to configure the `tool_choice` parameter in your LLM integration. Here's how you can do it:

1. Set up the LLM integration: include the `tool_choice` parameter and set it to `'required'`. For example:
```python
from livekit.agents import AgentSession
from livekit.plugins import openai

session = AgentSession(
    llm=openai.LLM.with_x_ai(
        model="grok-2-public",
        temperature=1.0,
        tool_choice="required",  # force the model to call a tool on each LLM turn
    ),
    # ... stt, tts, vad, and other configuration ...
)
```
2. Configure the pipeline agent: you can also force tool usage per request from a before-LLM callback when building a `VoicePipelineAgent`. For example:
```python
from livekit.agents.pipeline import VoicePipelineAgent

def before_llm_cb(agent, chat_ctx):
    # Force tool usage on this LLM call
    return agent.llm.chat(
        chat_ctx=chat_ctx,
        fnc_ctx=agent.fnc_ctx,
        tool_choice="required",
    )

agent = VoicePipelineAgent(
    vad=vad,  # vad, stt, llm, tts, fnc_ctx are created elsewhere
    stt=stt,
    llm=llm,
    tts=tts,
    fnc_ctx=fnc_ctx,
    # ... other options ...
    before_llm_cb=before_llm_cb,
)
```
The `tool_choice` parameter is supported by multiple LLM integrations, including Fireworks, Together AI, and Perplexity. It allows you to control tool usage with options like:
• `'auto'`: Let the model decide (default).
• `'required'`: Force tool usage.
• `'none'`: Disable tool usage.

Let me know if you need further assistance!

Sources: xAI LLM integration guide | LiveKit Docs | Fireworks AI LLM integration guide | LiveKit Docs | livekit-plugins/livekit-plugins-aws/livekit/plugins/aws/experimental/realti... | Perplexity LLM integration guide | LiveKit Docs