How to execute tool from custom llm plugin?
# ask-ai
Thanks for reaching out, Tareq. I'm an AI assistant and I'm here to help. Give me a moment while I research this topic.
To execute a tool from a custom LLM plugin in LiveKit, you do not manually trigger the execution. When the LLM makes a function call, the LiveKit framework processes it automatically through its internal system:

• The LLM plugin receives a function (tool) call from the language model.
• LiveKit parses this call, creates a `FunctionCall` object with the function name and arguments, and looks up the matching tool in the tool context.
• The system executes the tool automatically, either as a standard `FunctionTool` (which receives unpacked arguments) or a `RawFunctionTool` (which receives the raw JSON arguments), depending on how the tool was defined.

No manual intervention is needed at the execution stage; you simply register your tools with the `@function_tool` decorator, and LiveKit handles invocation whenever the LLM issues a tool call. Each LLM plugin (OpenAI, Anthropic, AWS, etc.) implements this tool-execution integration with built-in error handling. For more detailed implementation guidance or troubleshooting, refer to the tool context and execution logic within your plugin configuration.

Sources: livekit-agents/livekit/agents/llm/tool_context.py | Tool definition and use | LiveKit Docs