# ask-ai
f
can I try gpt-5 for llm part?
t
Thanks for reaching out, Baymurat. I'm an AI assistant and I'm here to help. Give me a moment while I research this topic.
Absolutely, you can use GPT-5 as the LLM in LiveKit! It works smoothly with the OpenAI integration, and you can pick it when you're setting up the LLM component. If you'd like a hand getting it all set up, just give me a shout; I'm here to help!
Sources: livekit-plugins/livekit-plugins-google/livekit/plugins/google/llm.py | Large language model (LLM) integrations | LiveKit Docs
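As a rough illustration of what that setup can look like (a minimal sketch assuming the LiveKit Agents 1.x `AgentSession` API and the `livekit-plugins-openai` plugin; the entrypoint and instructions below are placeholders, not from this thread):

```python
# Minimal sketch: selecting GPT-5 via the OpenAI plugin in a LiveKit Agents 1.x app.
# Assumes livekit-agents and livekit-plugins-openai are installed and
# OPENAI_API_KEY is set in the environment.
from livekit import agents
from livekit.agents import Agent, AgentSession
from livekit.plugins import openai


async def entrypoint(ctx: agents.JobContext):
    await ctx.connect()

    session = AgentSession(
        # The model is chosen here by name; in a voice agent the other
        # pipeline components (stt, tts, vad) are passed alongside it.
        llm=openai.LLM(model="gpt-5"),
    )

    await session.start(
        room=ctx.room,
        agent=Agent(instructions="You are a helpful assistant."),  # placeholder instructions
    )


if __name__ == "__main__":
    agents.cli.run_app(agents.WorkerOptions(entrypoint_fnc=entrypoint))
```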
f
what are the model names?
t
Here are the GPT-5 model names you can find in LiveKit:
• gpt-5
• gpt-5-mini
• gpt-5-nano
Feel free to pick the one that suits your needs best. And if you want, I'm here to help you choose or set up any of these models; just let me know!
f
but I am having this error:

```
    raise APIStatusError(
livekit.agents._exceptions.APIStatusError: Error code: 400 - {'error': {'message': 'Your organization must be verified to stream this model. Please go to: https://platform.openai.com/settings/organization/general and click on Verify Organization. If you just verified, it can take up to 15 minutes for access to propagate.', 'type': 'invalid_request_error', 'param': 'stream', 'code': 'unsupported_value'}} (status_code=400, request_id=req_c75002fdc02749a1bc936f65fa9a1c8a, body={'message': 'Your organization must be verified to stream this model. Please go to: https://platform.openai.com/settings/organization/general and click on Verify Organization. If you just verified, it can take up to 15 minutes for access to propagate.', 'type': 'invalid_request_error', 'param': 'stream', 'code': 'unsupported_value'}, retryable=True)
2025-08-13 10:55:46,777 - WARNING livekit.agents - failed to generate LLM completion, retrying in 2.0s {"llm": "livekit.plugins.openai.llm.LLM", "attempt": 3}
Traceback (most recent call last):
  File "/opt/anaconda3/lib/python3.12/site-packages/livekit/agents/llm/llm.py", line 148, in _main_task
    return await self._run()
           ^^^^^^^^^^^^^^^^^
  File "/opt/anaconda3/lib/python3.12/site-packages/livekit/plugins/openai/llm.py", line 695, in _run
    raise APIStatusError(
livekit.agents._exceptions.APIStatusError: Error code: 400 - {'error': {'message': 'Your organization must be verified to stream this model. Please go to: https://platform.openai.com/settings/organization/general and click on Verify Organization. If you just verified, it can take up to 15 minutes for access to propagate.', 'type': 'invalid_request_error', 'param': 'stream', 'code': 'unsupported_value'}} (status_code=400, request_id=req_c75002fdc02749a1bc936f65fa9a1c8a, body={'message': 'Your organization must be verified to stream this model. Please go to: https://platform.openai.com/settings/organization/general and click on Verify Organization. If you just verified, it can take up to 15 minutes for access to propagate.', 'type': 'invalid_request_error', 'param': 'stream', 'code': 'unsupported_value'}, retryable=True)
2025-08-13 10:55:49 [ ERROR] livekit.agents: Error in _inference_task
Traceback (most recent call last):
  File "/opt/anaconda3/lib/python3.12/site-packages/livekit/agents/llm/llm.py", line 148, in _main_task
    return await self._run()
           ^^^^^^^^^^^^^^^^^
  File "/opt/anaconda3/lib/python3.12/site-packages/livekit/plugins/openai/llm.py", line 695, in _run
    raise APIStatusError(
livekit.agents._exceptions.APIStatusError: Error code: 400 - {'error': {'message': 'Your organization must be verified to stream this model. Please go to: https://platform.openai.com/settings/organization/general and click on Verify Organization. If you just verified, it can take up to 15 minutes for access to propagate.', 'type': 'invalid_request_error', 'param': 'stream', 'code': 'unsupported_value'}} (status_code=400, request_id=req_a5cf5bef41f54aa592957cbc0dfbba9c, body={'message': 'Your organization must be verified to stream this model. Please go to: https://platform.openai.com/settings/organization/general and click on Verify Organization. If you just verified, it can take up to 15 minutes for access to propagate.', 'type': 'invalid_request_error', 'param': 'stream', 'code': 'unsupported_value'}, retryable=True)

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/opt/anaconda3/lib/python3.12/site-packages/livekit/agents/utils/log.py", line 16, in async_fn_logs
    return await fn(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/anaconda3/lib/python3.12/site-packages/livekit/agents/voice/generation.py", line 85, in _inference_task
    async for chunk in llm_node:
  File "/opt/anaconda3/lib/python3.12/site-packages/livekit/agents/voice/agent.py", line 356, in llm_node
    async for chunk in stream:
  File "/opt/anaconda3/lib/python3.12/site-packages/livekit/agents/llm/llm.py", line 246, in __anext__
    raise exc  # noqa: B904
    ^^^^^^^^^
  File "/opt/anaconda3/lib/python3.12/site-packages/livekit/agents/llm/llm.py", line 157, in _main_task
    raise APIConnectionError(
livekit.agents._exceptions.APIConnectionError: failed to generate LLM completion after 4 attempts (body=None, retryable=True)
2025-08-13 10:55:49,526 - ERROR livekit.agents - Error in _inference_task
Traceback (most recent call last):
  File "/opt/anaconda3/lib/python3.12/site-packages/livekit/agents/llm/llm.py", line 148, in _main_task
    return await self._run()
           ^^^^^^^^^^^^^^^^^
  File "/opt/anaconda3/lib/python3.12/site-packages/livekit/plugins/openai/llm.py", line 695, in _run
    raise APIStatusError(
livekit.agents._exceptions.APIStatusError: Error code: 400 - {'error': {'message': 'Your organization must be verified to stream this model. Please go to: https://platform.openai.com/settings/organization/general and click on Verify Organization. If you just verified, it can take up to 15 minutes for access to propagate.', 'type': 'invalid_request_error', 'param': 'stream', 'code': 'unsupported_value'}} (status_code=400, request_id=req_a5cf5bef41f54aa592957cbc0dfbba9c, body={'message': 'Your organization must be verified to stream this model. Please go to: https://platform.openai.com/settings/organization/general and click on Verify Organization. If you just verified, it can take up to 15 minutes for access to propagate.', 'type': 'invalid_request_error', 'param': 'stream', 'code': 'unsupported_value'}, retryable=True)

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/opt/anaconda3/lib/python3.12/site-packages/livekit/agents/utils/log.py", line 16, in async_fn_logs
    return await fn(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/anaconda3/lib/python3.12/site-packages/livekit/agents/voice/generation.py", line 85, in _inference_task
    async for chunk in llm_node:
  File "/opt/anaconda3/lib/python3.12/site-packages/livekit/agents/voice/agent.py", line 356, in llm_node
    async for chunk in stream:
  File "/opt/anaconda3/lib/python3.12/site-packages/livekit/agents/llm/llm.py", line 246, in __anext__
    raise exc  # noqa: B904
    ^^^^^^^^^
  File "/opt/anaconda3/lib/python3.12/site-packages/livekit/agents/llm/llm.py", line 157, in _main_task
    raise APIConnectionError(
livekit.agents._exceptions.APIConnectionError: failed to generate LLM completion after 4 attempts (body=None, retryable=True)
```
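The error body points at the `stream` parameter rather than at LiveKit itself: per the message, OpenAI rejects streamed completions for this model until the organization is verified. One way to check whether verification has propagated is to call the model directly with the `openai` SDK, outside the agent (a sketch; the prompt is just a placeholder):

```python
# Sketch: reproduce the 400 outside LiveKit with the plain openai SDK (1.x).
# Per the error text, non-streaming requests are allowed while streaming is
# blocked until organization verification propagates (up to ~15 minutes).
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

messages = [{"role": "user", "content": "ping"}]  # placeholder prompt

# Non-streaming request: expected to succeed even before verification propagates.
resp = client.chat.completions.create(model="gpt-5", messages=messages)
print(resp.choices[0].message.content)

# Streaming request: should raise the same
# "Your organization must be verified to stream this model" 400
# until verification has propagated, and stream normally afterwards.
stream = client.chat.completions.create(model="gpt-5", messages=messages, stream=True)
for chunk in stream:
    delta = chunk.choices[0].delta.content
    if delta:
        print(delta, end="", flush=True)
print()
```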
t
I don't have the answer you're looking for. You could also try asking your question:
• in one of the other Slack channels, or
• to https://deepwiki.com/livekit/livekit_composite which is trained on all LiveKit source code
If you find the answer, please post it here to help others!
f
these are the versions I am using rn:
- livekit-agents: 1.1.5
- livekit-plugins-openai: 1.1.5
- openai: 1.93.0
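A small way to double-check those versions from the same Python environment the agent runs in (assuming the standard PyPI distribution names):

```python
# Print the installed versions of the packages listed above.
from importlib.metadata import version

for dist in ("livekit-agents", "livekit-plugins-openai", "openai"):
    print(f"{dist}: {version(dist)}")
```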