# ask-ai
p
can a function tool call create an agent task?
t
Thanks for reaching out, harish. I'm an AI assistant and I'm here to help. Give me a moment while I research this topic.
Hi there! I wanted to share that in LiveKit, a function tool call can initiate a handoff to another agent; this is a key feature of the system. Right now, creating an "agent task" from a tool call is still experimental and available only in Python, not Node.js. If you're looking to transfer control to another agent, you can do this by defining a tool that returns a handoff object (for example, the next Agent instance). Please feel free to reach out if you'd like me to walk you through the details or provide some examples!
Sources: Tool definition and use | LiveKit Docs | agents/src/voice/agent_activity.ts | examples/voice_agents/silent_function_call.py | Workflows | LiveKit Docs
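For reference, a tool-based handoff in Python typically looks something like the rough sketch below. The agent names, instructions, and the transfer_to_support tool are made up for illustration; the key idea is that returning the next Agent from a function tool hands the session over to it.

```python
from livekit.agents import Agent, RunContext, function_tool


class SupportAgent(Agent):
    def __init__(self) -> None:
        super().__init__(instructions="You handle detailed support questions.")


class GreeterAgent(Agent):
    def __init__(self) -> None:
        super().__init__(instructions="You greet users and route them to the right agent.")

    @function_tool
    async def transfer_to_support(self, ctx: RunContext):
        """Transfer the user to the support agent."""
        # Returning an Agent instance from a tool triggers a handoff:
        # the session continues with SupportAgent in control.
        return SupportAgent()
```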
p
I want to know if I can create an agent task inside a function tool call?
t
Thanks for reaching out! Creating an agent task directly inside a function tool call is an experimental feature right now and is supported only in Python; it isn't available in Node.js yet, even though many LiveKit implementations use Node.js. If you're working with Python and would like help with this experimental feature, please don't hesitate to ask. If you're using Node.js, a good alternative is to use function tool calls to start agent handoffs (as sketched above), and I'd be happy to assist with that as well!
p
give an example in python
t
I'd love to walk you through an example of creating an agent task inside a function tool call in Python for LiveKit:
```python
from dataclasses import dataclass
import random

from livekit.agents import Agent, AgentTask, RunContext, function_tool

@dataclass
class RandomResult:
    random_number: int

class InlineAgent(AgentTask[RandomResult]):
    def __init__(self, *, oneshot: bool) -> None:
        super().__init__(instructions="You are a voice assistant")
        self._oneshot = oneshot

    async def on_enter(self) -> None:
        if self._oneshot:
            # Immediately trigger the tool call when the task starts
            self.session.generate_reply(instructions="Call the generate_number tool")

    @function_tool
    async def generate_number(self, ctx: RunContext):
        """Generate a random number and complete the task"""
        # Complete the task by calling self.complete() with the result
        self.complete(RandomResult(random_number=random.randint(1, 100)))
        return None

class AshAgent(Agent):
    def __init__(self, *, oneshot: bool) -> None:
        super().__init__(instructions="Your name is Ash. You would interact with users via voice.")
        self._oneshot = oneshot

    @function_tool
    async def start_random_generator(self, ctx: RunContext):
        """Get a random number by starting the random generator task"""
        # Create and await the task within the function tool
        random_result = await InlineAgent(oneshot=self._oneshot)
        return random_result.random_number
```
This example shows how to define an AgentTask (InlineAgent) that generates a random number and completes itself, and how another agent (AshAgent) runs that task from inside a function tool call. If you have any questions or would like more examples, please feel free to ask!
Sources: Tool definition and use | LiveKit Docs | Workflows | LiveKit Docs | tests/test_evals.py | Workflows | LiveKit Docs
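If it helps, here is a minimal, untested sketch of how AshAgent from the example above might be wired into a session, assuming the standard LiveKit Agents 1.x entrypoint pattern and an OpenAI realtime model as an arbitrary plugin choice:

```python
from livekit import agents
from livekit.agents import AgentSession
from livekit.plugins import openai  # assumed plugin choice for this sketch


async def entrypoint(ctx: agents.JobContext):
    await ctx.connect()

    # Any LLM/STT/TTS configuration works here; the realtime model keeps the sketch short.
    session = AgentSession(llm=openai.realtime.RealtimeModel())

    # AshAgent is the agent defined in the example above.
    await session.start(room=ctx.room, agent=AshAgent(oneshot=True))


if __name__ == "__main__":
    agents.cli.run_app(agents.WorkerOptions(entrypoint_fnc=entrypoint))
```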
p
is there a timeout for tool calls?
t
Yes, tool calls in LiveKit can have timeouts, but there isn't a single global timeout for all tool calls. The timeouts depend on the specific integration or plugin you're working with:
• LLM HTTP request timeout: for plugins like OpenAI, the default read timeout is 5 seconds, and some plugins let you customize it to better fit your needs.
• MCP server timeout: the default timeout for client sessions communicating with the MCP server is also 5 seconds.
• Custom timeouts: you can set your own timeouts for longer operations, which helps provide useful feedback to users and keep execution under control.
If you'd like, I'm here to help you configure or manage timeouts for your specific use case, just let me know!
Sources: tools/server/tests/utils.py | External data and RAG | LiveKit Docs | External data and RAG | LiveKit Docs | plugins/test/src/llm.ts | livekit-plugins/livekit-plugins-openai/livekit/plugins/openai/llm.py
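To make the "custom timeouts" point concrete, here is a rough, untested sketch of enforcing your own per-call timeout inside a function tool with asyncio.wait_for; the fetch_report helper and the 10-second limit are invented for illustration and are not part of the LiveKit API:

```python
import asyncio

from livekit.agents import Agent, RunContext, function_tool


async def fetch_report(query: str) -> str:
    """Stand-in for a slow external call (hypothetical helper)."""
    await asyncio.sleep(30)
    return f"report for {query}"


class ReportAgent(Agent):
    def __init__(self) -> None:
        super().__init__(instructions="You look up reports for the user.")

    @function_tool
    async def get_report(self, ctx: RunContext, query: str):
        """Fetch a report, giving up after 10 seconds."""
        try:
            # Enforce our own timeout instead of relying on plugin defaults.
            return await asyncio.wait_for(fetch_report(query), timeout=10.0)
        except asyncio.TimeoutError:
            # Returning a string lets the LLM explain the delay to the user.
            return "The report lookup timed out; please try again in a moment."
```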