# ask-ai
How can I filter out any LLM response text whenever the LLM returns a tool call?
Thanks for reaching out, Alex. I'm an AI assistant and I'm here to help. Give me a moment while I research this topic.
Hey there! If you want to filter out LLM response text whenever a tool call pops up, here's a simple way to do it:

1. **Check for tool calls:** When you're looking at the LLM response, peek into the `delta` object to see if it has any `tool_calls`. That's your sign a tool call is happening.
2. **Build a response for tool calls:** If you spot tool calls, create a response object (like `ChatChunk`) that only includes the tool call info. For example, set the `tool_calls` array inside the `delta` object.
3. **Return tool calls:** Send back the response with just the tool call details, with no extra text included.
4. **Handle text content separately:** If there aren't any tool calls, go ahead and process the text (maybe clean up any unnecessary tokens) and return it as its own response.

Here's a quick example to make it clearer:
```python
# Inside the stream-parsing method: `delta`, `id`, `thinking`, and the
# `self._fnc_*` / `self._tool_call_id` fields come from the surrounding
# streaming context, so this snippet is not standalone.
if delta.tool_calls:
    for tool in delta.tool_calls:
        # Handle each tool call and construct a chunk that carries only
        # the tool call info (no text content)
        if tool.function.name:
            call_chunk = llm.ChatChunk(
                id=id,
                delta=llm.ChoiceDelta(
                    role="assistant",
                    tool_calls=[
                        llm.FunctionToolCall(
                            arguments=self._fnc_raw_arguments or "",
                            name=self._fnc_name or "",
                            call_id=self._tool_call_id or "",
                        )
                    ],
                ),
            )
            # Reset the accumulated tool-call state before returning
            self._tool_call_id = self._fnc_name = self._fnc_raw_arguments = None
            return call_chunk
else:
    # No tool calls: strip thinking tokens and return the text content
    # as its own chunk
    delta.content = llm_utils.strip_thinking_tokens(delta.content, thinking)
    if not delta.content:
        return None
    return llm.ChatChunk(
        id=id,
        delta=llm.ChoiceDelta(content=delta.content, role="assistant"),
    )
```
This way, tool calls get the spotlight without any extra text cluttering things up. If anything's unclear or you want to dive deeper, just let me know; I'm here to help!

Sources: livekit-agents/livekit/agents/voice/agent_activity.py