# ask-ai
s
Why do `self.session.say()` or `self.session.generate_reply()` not work for me inside the `llm_node`? I do not see any errors logged either.
t
Thanks for reaching out, Jawad. I'm an AI assistant and I'm here to help. Give me a moment while I research this topic.
Could you provide more details about your setup? For example, are you using a specific version of LiveKit, and could you share the context or code snippet where you are trying to use `self.session.say()` or `self.session.generate_reply()` inside the `llm_node`? This will help in identifying the issue.
s
I am using LiveKit version 1.0.9. Here is my code:
```python
async def llm_node(self, chat_ctx, tools, model_settings):
    """
    LangGraph entrypoint for LLM response generation.

    Extracts the latest user message and full chat history,
    loads or initializes LangGraph state, and invokes the agent graph.

    Args:
        chat_ctx (ChatContext): Conversation history context
        tools (list): Optional tool definitions
        model_settings (ModelSettings): LLM settings

    Returns:
        str: AI response string
    """
    import asyncio

    from asgiref.sync import sync_to_async

    # Lazily build the LangGraph agent graph on the first call.
    if not hasattr(self, "graph"):
        session = await sync_to_async(AgentSession.objects.get)(session_id=self.thread_id)

        agent = await sync_to_async(lambda: session.agent.enabled_agents.first())()
        self.agent_obj = agent

        self.graph = await sync_to_async(lambda: ActiveAgentBuilder(agent).build())()

    # Collect user messages from the chat context; content may be a list of parts.
    user_messages = [
        item.content[0] if isinstance(item.content, list) else item.content
        for item in chat_ctx.items
        if item.role == "user"
    ]

    latest_user_message = user_messages[-1] if user_messages else ""

    # Load (or bootstrap) persisted state, then run the graph in a thread
    # so the event loop is not blocked.
    state = await sync_to_async(load_or_bootstrap_state)(self.thread_id, self.agent_obj)
    state["last_user_message"] = latest_user_message

    state = await sync_to_async(
        lambda: self.graph.invoke(input=state, config={"configurable": {"thread_id": self.thread_id}})
    )()

    last_ai_message = state.get("last_ai_message")

    # End-of-call branch: speak a goodbye, then drain and close the session.
    if state.get("end_call"):
        print("END CALL INVOKED")
        await self.session.say("Thank you for the conversation. Goodbye!")
        await asyncio.sleep(6)
        await self.session.drain()
        await self.session.aclose()

    return f"{last_ai_message}"
```
t
I don't have the answer you're looking for. You could also try asking your question:
• in one of the other Slack channels, or
• to https://deepwiki.com/livekit/livekit_composite, which is trained on all LiveKit source code

If you find the answer, please post it here to help others!
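For anyone trying to reproduce this, below is a minimal harness sketch showing how a custom `llm_node` override like the one above is typically wired into a LiveKit Agents 1.x worker. The class name, instructions text, and the bare `AgentSession()` with no STT/LLM/TTS plugins are illustrative assumptions, not taken from the thread; only the `llm_node` signature and the returned string mirror the snippet above. Note that `AgentSession` here is the LiveKit class, not the Django model of the same name queried via `AgentSession.objects.get(...)` in the snippet.

```python
# Minimal harness sketch (assumptions, not from the thread): a custom Agent
# subclass with an llm_node override, started from a LiveKit Agents 1.x
# worker entrypoint.
from livekit import agents
from livekit.agents import Agent, AgentSession


class GraphAgent(Agent):  # hypothetical class name
    def __init__(self) -> None:
        super().__init__(instructions="You are a helpful voice assistant.")

    async def llm_node(self, chat_ctx, tools, model_settings):
        # The LangGraph logic from the thread would live here; this sketch
        # just returns text, mirroring the string return in the snippet above.
        return "Hello from the custom llm_node."


async def entrypoint(ctx: agents.JobContext):
    await ctx.connect()
    # STT/LLM/TTS/VAD plugins are omitted in this sketch; a real session
    # would normally pass them to AgentSession(...).
    session = AgentSession()
    await session.start(agent=GraphAgent(), room=ctx.room)


if __name__ == "__main__":
    agents.cli.run_app(agents.WorkerOptions(entrypoint_fnc=entrypoint))
```

In this harness, the text returned from `llm_node` is what the rest of the pipeline consumes for the spoken reply, which is the path the snippet above relies on for normal responses.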