# ask-ai
w
I want you to send me updated boilerplate code for Agents 1.0 that has replacements for the things mentioned above: chat_ctx=initial_ctx, before_llm_cb=before_llm_cb, before_tts_cb=before_tts_cb, allow_interruptions=True
t
Thanks for reaching out, Atharva. I'm an AI assistant and I'm here to help. Give me a moment while I research this topic.
Here is the updated boilerplate code for Agents 1.0 with replacements for the parameters you mentioned:
```python
from livekit.agents import Agent, AgentSession, JobContext
from livekit.plugins import (
    cartesia,
    deepgram,
    google,
    silero,
)


async def entrypoint(ctx: JobContext):
    await ctx.connect()

    # chat_ctx=initial_ctx is replaced by instructions (or chat_ctx) on the Agent
    agent = Agent(instructions="You are a helpful voice AI assistant.")

    session = AgentSession(
        vad=silero.VAD.load(),
        stt=deepgram.STT(),
        llm=google.LLM(),
        tts=cartesia.TTS(),
        allow_interruptions=True,  # still supported in 1.0 as a session option
    )

    await session.start(room=ctx.room, agent=agent)
    await session.say("Hey, how can I help you today?", allow_interruptions=True)
```
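If you're running this as a standalone worker, the entrypoint is typically wired up the same way as in 0.x (a minimal sketch; adjust to your project layout):

```python
from livekit.agents import WorkerOptions, cli

if __name__ == "__main__":
    # start the worker; LiveKit calls entrypoint() for each job it dispatches
    cli.run_app(WorkerOptions(entrypoint_fnc=entrypoint))
```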
For the callbacks:
- `before_llm_cb` is replaced with `llm_node`
- `before_tts_cb` is replaced with `tts_node`

These nodes provide more flexibility for customizing the pipeline behavior. Let me know if you need further assistance!

Sources: Agents 0.x migration guide | LiveKit Docs
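If it helps, here is a rough sketch of how those nodes slot into the boilerplate above. The system message and text substitution are illustrative placeholders (not from the migration guide), and the exact import path for `ModelSettings` may vary slightly by version:

```python
from collections.abc import AsyncIterable

from livekit.agents import Agent, ModelSettings, llm


class MyAgent(Agent):
    def __init__(self) -> None:
        super().__init__(instructions="You are a helpful voice AI assistant.")

    # replaces before_llm_cb: adjust the chat context before each LLM turn
    async def llm_node(
        self,
        chat_ctx: llm.ChatContext,
        tools: list[llm.FunctionTool],
        model_settings: ModelSettings,
    ) -> AsyncIterable[llm.ChatChunk]:
        chat_ctx.add_message(role="system", content="Keep answers concise.")  # placeholder tweak
        return Agent.default.llm_node(self, chat_ctx, tools, model_settings)

    # replaces before_tts_cb: pre-process the text stream before synthesis
    async def tts_node(self, text: AsyncIterable[str], model_settings: ModelSettings):
        async def cleaned() -> AsyncIterable[str]:
            async for chunk in text:
                yield chunk.replace("LiveKit", "Live Kit")  # placeholder substitution

        return Agent.default.tts_node(self, cleaned(), model_settings)
```

You would then start the session with `await session.start(room=ctx.room, agent=MyAgent())` instead of the plain `Agent`.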
w
In this code, can you also show the alternative to before_llm_cb=before_llm_cb, before_tts_cb=before_tts_cb, and allow_interruptions=True?

```python
class MyAgent(Agent):
    async def tts_node(self, text: AsyncIterable[str], model_settings: ModelSettings):
        # use default implementation, but pre-process the text
        return Agent.default.tts_node(self, tokenize.utils.replace_words(text), model_settings)
```

```python
class MyAgent(Agent):
    # override method from superclass to customize behavior
    async def llm_node(
        self,
        chat_ctx: llm.ChatContext,
        tools: list[llm.FunctionTool],
        model_settings: ModelSettings,
    ) -> AsyncIterable[llm.ChatChunk]:
        rag_context: str = retrieve(chat_ctx)
        chat_ctx.add_message(content=rag_context, role="system")
        # update the context for persistence
        # await self.update_chat_ctx(chat_ctx)
        return Agent.default.llm_node(self, chat_ctx, tools, model_settings)
```

Chat context

ChatContext has been overhauled in 1.0 to provide a more powerful and flexible API for managing chat history. It now accounts for differences between LLM providers, such as stateless and stateful APIs, while exposing a unified interface. Chat history can now include three types of items:
• `ChatMessage`: a message associated with a role (e.g., user, assistant). Each message includes a list of `content` items, which can contain text, images, or audio.
• `FunctionCall`: a function call initiated by the LLM.
• `FunctionCallOutput`: the result returned from a function call.

Updating chat context

In 0.x, updating the chat context required modifying chat_ctx.messages directly. This approach was error-prone and difficult to time correctly, especially with realtime APIs. In v1.0, there are two supported ways to update the chat context:
• Agent handoff: transferring control to a new agent, which will have its own chat context.
• Explicit update: calling `agent.update_chat_ctx()` to modify the context directly.
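For example, a minimal sketch of the explicit-update path (the `add_note` helper is hypothetical, and `chat_ctx.copy()` may differ slightly by version):

```python
from livekit.agents import Agent


class NoteTakingAgent(Agent):
    async def add_note(self, note: str) -> None:
        # hypothetical helper: copy the current history, append a system note,
        # then persist it on this agent via the explicit-update API
        chat_ctx = self.chat_ctx.copy()
        chat_ctx.add_message(role="system", content=note)
        await self.update_chat_ctx(chat_ctx)
```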
Transcriptions

Agents 1.0 brings some changes to how transcriptions are handled:
• Transcriptions now use text streams with the topic `lk.transcription`.
• The old transcription protocol is deprecated and will be removed in v1.1.
• For now, both protocols are used for backwards compatibility.
• Upcoming SDK and component versions will standardize on text streams for transcriptions.
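As a rough sketch of reading those transcription streams on the room side (assumes the text-stream receiver API, `register_text_stream_handler` and `read_all`, from the LiveKit Python SDK; check the text streams docs for the exact signatures):

```python
import asyncio

from livekit import rtc


def listen_for_transcriptions(room: rtc.Room) -> None:
    # called once per incoming text stream published on the lk.transcription topic
    def on_transcription(reader, participant_identity):
        async def consume() -> None:
            text = await reader.read_all()
            print(f"transcription from {participant_identity}: {text}")

        asyncio.create_task(consume())

    room.register_text_stream_handler("lk.transcription", on_transcription)
```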
Accepting text input

Agents 1.0 introduces improved support for text input. Previously, text had to be manually intercepted and injected into the agent via `ChatManager`. In this version, agents automatically receive text input from a text stream on the `lk.chat` topic. The `ChatManager` has been removed in Python SDK v1.0.
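And a matching sketch for sending text input to the agent from another participant (assumes `send_text` with a `topic` argument on the local participant, per the text streams docs):

```python
from livekit import rtc


async def send_chat_message(room: rtc.Room, message: str) -> None:
    # publish the text on the lk.chat topic; the agent picks it up automatically
    await room.local_participant.send_text(message, topic="lk.chat")
```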
t
I'm sorry, but I don't have the information you need at the moment. If you would like a member of the LiveKit team to respond, please ask your question in one of the other channels.
w
I want you to send me updated boilerplate code for Agents 1.0 that has replacements for the things mentioned above: chat_ctx=initial_ctx, before_llm_cb=before_llm_cb, before_tts_cb=before_tts_cb, allow_interruptions=True