# ask-ai
```
2025-08-08 10:28:12,580 - DEBUG livekit.agents - initializing inference runner {"runner": "lk_end_of_utterance_multilingual", "pid": 336, "inference": true}
None of PyTorch, TensorFlow >= 2.0, or Flax have been found. Models won't be available and only tokenizers, configuration and file/data utilities can be used.
2025-08-08 10:28:16,408 - INFO livekit.agents - killing process {"pid": 336, "inference": true}
2025-08-08 10:28:16,409 - ERROR livekit.agents - worker failed
Traceback (most recent call last):
  File "/usr/local/lib/python3.11/asyncio/tasks.py", line 500, in wait_for
    return fut.result()
           ^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/livekit/agents/ipc/channel.py", line 47, in arecv_message
    return _read_message(await dplx.recv_bytes(), messages)
           ^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/livekit/agents/utils/aio/duplex_unix.py", line 35, in recv_bytes
    len_bytes = await self._reader.readexactly(4)
                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/asyncio/streams.py", line 750, in readexactly
    await self._wait_for_data('readexactly')
  File "/usr/local/lib/python3.11/asyncio/streams.py", line 543, in _wait_for_data
    await self._waiter
asyncio.exceptions.CancelledError

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/local/lib/python3.11/site-packages/livekit/agents/cli/_run.py", line 79, in _worker_run
    await worker.run()
  File "/usr/local/lib/python3.11/site-packages/livekit/agents/worker.py", line 402, in run
    await self._inference_executor.initialize()
  File "/usr/local/lib/python3.11/site-packages/livekit/agents/ipc/supervised_proc.py", line 169, in initialize
    init_res = await asyncio.wait_for(
               ^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/asyncio/tasks.py", line 502, in wait_for
    raise exceptions.TimeoutError() from exc
TimeoutError
```
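The warning from transformers ("None of PyTorch, TensorFlow >= 2.0, or Flax have been found") suggests the `lk_end_of_utterance_multilingual` model could not be loaded, so the inference subprocess never finished initializing and the worker killed it on timeout. A quick environment check (a hypothetical diagnostic sketch, not part of livekit.agents) would be:

```python
import importlib.util

def ml_backend_available() -> bool:
    """Return True if any of the ML backends transformers looks for
    (PyTorch, TensorFlow, Flax) can be imported in this environment."""
    return any(
        importlib.util.find_spec(name) is not None
        for name in ("torch", "tensorflow", "flax")
    )

if not ml_backend_available():
    # Matches the condition behind the transformers warning in the log above.
    print("No ML backend found; install one, e.g.: pip install torch")
```

If this reports no backend, installing PyTorch in the same environment the worker runs in (e.g. `pip install torch`) and restarting the worker is a plausible first fix to try.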
Thanks for reaching out, Sandeep. I'm an AI assistant and I'm here to help. Give me a moment while I research this topic.