steep-keyboard-85291
07/07/2025, 7:50 AMWhy do self.session.say() or self.session.generate_reply() not work for me inside the llm_node? I do not see any errors logged either.
07/07/2025, 7:50 AMtall-airline-16756
07/07/2025, 7:50 AMself.session.say()
or self.session.generate_reply()
inside the llm_node
? This will help in identifying the issue.steep-keyboard-85291
07/07/2025, 7:52 AMasync def llm_node(self, chat_ctx, tools, model_settings):
"""
LangGraph entrypoint for LLM response generation.
Extracts latest user message and full chat history,
loads or initializes LangGraph state, and invokes the agent graph.
Args:
chat_ctx (ChatContext): Conversation history context
tools (list): Optional tool definitions
model_settings (ModelSettings): LLM settings
Returns:
str: AI response string
"""
from asgiref.sync import sync_to_async
if not hasattr(self, "graph"):
session = await sync_to_async(AgentSession.objects.get)(session_id=self.thread_id)
agent = await sync_to_async(lambda: session.agent.enabled_agents.first())()
self.agent_obj = agent
self.graph = await sync_to_async(lambda: ActiveAgentBuilder(agent).build())()
self.agent_obj = agent
user_messages = [
item.content[0] if isinstance(item.content, list) else item.content
for item in chat_ctx.items
if item.role == "user"
]
latest_user_message = user_messages[-1] if user_messages else ""
state = await sync_to_async(load_or_bootstrap_state)(self.thread_id, self.agent_obj)
state["last_user_message"] = latest_user_message
state = await sync_to_async(
lambda: self.graph.invoke(input=state, config={"configurable": {"thread_id": self.thread_id}})
)()
last_ai_message = state.get("last_ai_message")
if state.get("end_call"):
print(f"END CALL INVOKED")
await self.session.say("Thank you for the conversation. Goodbye!")
await asyncio.sleep(6)
await self.session.drain()
await self.session.aclose()
return f"{last_ai_message}"
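(Editor's note: the message-extraction step in the snippet above can be exercised in isolation. Below is a minimal sketch using a hypothetical stand-in `Item` class in place of real LiveKit chat-context items, which carry more fields; only the role/content access pattern is mirrored.)

```python
from dataclasses import dataclass
from typing import List, Union

@dataclass
class Item:
    # Hypothetical stand-in for a chat context item (real LiveKit types differ).
    role: str
    content: Union[str, List[str]]

def latest_user_message(items) -> str:
    """Mirror of the extraction logic in llm_node: collect user turns,
    unwrap list-form content to its first element, return the last one."""
    user_messages = [
        item.content[0] if isinstance(item.content, list) else item.content
        for item in items
        if item.role == "user"
    ]
    return user_messages[-1] if user_messages else ""

items = [
    Item("user", "hello"),
    Item("assistant", "hi, how can I help?"),
    Item("user", ["what is my balance?"]),
]
print(latest_user_message(items))  # -> what is my balance?
```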
tall-airline-16756
07/07/2025, 7:52 AM