# ask-ai
Yes, it is possible to change the LLM used in a LiveKit agent dynamically. You can set `agent._llm` to a new LLM instance within your function. Additionally, you can return a custom `LLMStream` from the `before_llm_cb` callback with your desired LLM configuration. This approach allows you to switch the LLM without creating a new agent instance. Would you like more details on how to implement this? Sources: LiveKit Slack Discussion, LiveKit Voice Assistant Args
@millions-hair-24956 show me a code example
Here's a code example demonstrating how to change the LLM used by a `VoicePipelineAgent` dynamically using the `before_llm_cb` callback:

```python
from livekit.agents import VoicePipelineAgent
from livekit.plugins import cartesia, deepgram, openai, silero

# Callback invoked before each LLM call; it can swap the agent's LLM
def before_llm_cb(agent, chat_ctx):
    # Create a new LLM instance with the desired configuration
    new_llm = openai.LLM(model="gpt-4o-mini", temperature=0.7)
    # Assign the new LLM to the agent
    agent._llm = new_llm
    # Optionally, return a custom LLMStream so this turn uses the new LLM
    return new_llm.chat(chat_ctx=chat_ctx)

# Initialize the VoicePipelineAgent with the callback
# (initial_ctx is a ChatContext created earlier)
agent = VoicePipelineAgent(
    vad=silero.VAD.load(),
    stt=deepgram.STT(),
    llm=openai.LLM(model="gpt-3.5-turbo"),
    tts=cartesia.TTS(),
    chat_ctx=initial_ctx,
    before_llm_cb=before_llm_cb,
)

# Start the agent
agent.start(room, participant)
```

This code sets up a `VoicePipelineAgent` and uses the `before_llm_cb` callback to switch to a new LLM instance with a different model and temperature before each LLM call. Would you like more details on any part of this code? Sources: LiveKit Slack Discussion, LiveKit Voice Assistant Args
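Independent of the LiveKit API, the mechanics of the swap above can be sketched with plain Python stand-ins. `FakeAgent` and `FakeLLM` below are illustrative classes, not LiveKit's, and only show the pattern: the callback replaces the agent's LLM and returns a stream from it for the current turn.

```python
# Minimal stand-ins illustrating the swap pattern (not LiveKit classes)
class FakeLLM:
    def __init__(self, model):
        self.model = model

    def chat(self, chat_ctx):
        # Stand-in for producing an LLM stream for this turn
        return f"stream from {self.model}"

class FakeAgent:
    def __init__(self, llm, before_llm_cb=None):
        self._llm = llm
        self._before_llm_cb = before_llm_cb

    def run_turn(self, chat_ctx):
        # If the callback returns a stream, use it; otherwise fall back
        # to the agent's current LLM
        if self._before_llm_cb is not None:
            stream = self._before_llm_cb(self, chat_ctx)
            if stream is not None:
                return stream
        return self._llm.chat(chat_ctx)

def before_llm_cb(agent, chat_ctx):
    new_llm = FakeLLM("gpt-4o-mini")
    agent._llm = new_llm           # subsequent turns use the new LLM
    return new_llm.chat(chat_ctx)  # this turn uses it too

agent = FakeAgent(FakeLLM("gpt-3.5-turbo"), before_llm_cb=before_llm_cb)
print(agent.run_turn([]))  # stream from gpt-4o-mini
print(agent._llm.model)    # gpt-4o-mini
```

The key point the sketch makes explicit: the assignment to `agent._llm` affects later turns, while the returned stream is what makes the *current* turn use the new model immediately.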
@millions-hair-24956 Can i do this in a callable function ?
Yes, you can encapsulate the logic to change the LLM in a callable function and pass that function as the callback. Here's an example:

```python
from livekit.agents import VoicePipelineAgent
from livekit.plugins import cartesia, deepgram, openai, silero

# Function to change the LLM; used as the before_llm_cb callback
def change_llm(agent, chat_ctx):
    # Create a new LLM instance with the desired configuration
    new_llm = openai.LLM(model="gpt-4o-mini", temperature=0.7)
    # Assign the new LLM to the agent
    agent._llm = new_llm
    # Return a custom LLMStream so this turn uses the new LLM
    return new_llm.chat(chat_ctx=chat_ctx)

# Initialize the VoicePipelineAgent
# (initial_ctx is a ChatContext created earlier)
agent = VoicePipelineAgent(
    vad=silero.VAD.load(),
    stt=deepgram.STT(),
    llm=openai.LLM(model="gpt-3.5-turbo"),
    tts=cartesia.TTS(),
    chat_ctx=initial_ctx,
    before_llm_cb=change_llm,  # use the function as a callback
)

# Start the agent
agent.start(room, participant)
```

In this example, the `change_llm` function is used as the `before_llm_cb` callback, allowing you to change the LLM dynamically before each LLM call. Would you like more details on this implementation? Sources: LiveKit Slack Discussion, LiveKit Voice Assistant Args
@millions-hair-24956 no, i talk about functioncontext, which is in fnc_ctx
To change the `FunctionContext` dynamically, you can modify the `fnc_ctx` property of the `VoicePipelineAgent`. Here's how you can do it within a callable function:

```python
from livekit.agents import VoicePipelineAgent
from livekit.agents.llm import function_context
from livekit.plugins import cartesia, deepgram, openai, silero

# Define a new FunctionContext
class CustomFunctionContext(function_context.FunctionContext):
    # Define your custom AI-callable functions here
    pass

# Function to change the FunctionContext
def change_function_context(agent):
    # Create a new FunctionContext instance
    new_fnc_ctx = CustomFunctionContext()
    # Assign the new FunctionContext to the agent
    agent.fnc_ctx = new_fnc_ctx

# Initialize the VoicePipelineAgent
# (initial_ctx is a ChatContext created earlier)
agent = VoicePipelineAgent(
    vad=silero.VAD.load(),
    stt=deepgram.STT(),
    llm=openai.LLM(model="gpt-3.5-turbo"),
    tts=cartesia.TTS(),
    chat_ctx=initial_ctx,
)

# Change the FunctionContext dynamically
change_function_context(agent)

# Start the agent
agent.start(room, participant)
```

In this example, the `change_function_context` function creates a new `FunctionContext` and assigns it to the agent's `fnc_ctx` property, allowing you to update the available functions dynamically. Would you like more details on this implementation? Sources: LiveKit Voice Assistant Args
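The `fnc_ctx` swap itself is plain attribute assignment, which can be sketched without LiveKit at all. The classes below (`FakeFunctionContext`, `FakeAgent`, and the `get_weather` tool) are illustrative stand-ins, not LiveKit's API; the sketch only shows that replacing the context object mid-session changes which functions the agent can call.

```python
# Minimal stand-ins showing the fnc_ctx swap (not LiveKit classes)
class FakeFunctionContext:
    def __init__(self, functions):
        # Map of function name -> callable the agent may invoke
        self.functions = functions

class FakeAgent:
    def __init__(self, fnc_ctx=None):
        self.fnc_ctx = fnc_ctx

def change_function_context(agent):
    # Replace the agent's tool set mid-session
    new_fnc_ctx = FakeFunctionContext(
        {"get_weather": lambda city: f"sunny in {city}"}
    )
    agent.fnc_ctx = new_fnc_ctx

# Start with no callable functions, then swap in a new context
agent = FakeAgent(FakeFunctionContext({}))
change_function_context(agent)
print(sorted(agent.fnc_ctx.functions))  # ['get_weather']
```

Because the agent reads `fnc_ctx` when it needs to resolve a function call, swapping the object is enough; nothing else has to be rebuilt.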