File "C:\Codeshastra_Projects\hamari-ai-monorepo\a...
# ask-ai
Pavan:
File "C:\Codeshastra_Projects\hamari-ai-monorepo\apps\api\agent-runner-python\venv\Lib\site-packages\litellm\litellm_core_utils\core_helpers.py", line 8, in <module> from litellm.types.llms.openai import AllMessageValues File "C:\Codeshastra_Projects\hamari-ai-monorepo\apps\api\agent-runner-python\venv\Lib\site-packages\litellm\types\llms\openai.py", line 36, in <module> from openai.types.responses.response import ( ImportError: cannot import name 'ResponseTextConfig' from 'openai.types.responses.response' (C:\Codeshastra_Projects\hamari-ai-monorepo\apps\api\agent-runner-python\venv\Lib\site-packages\openai\types\responses\response.py). Did you mean: 'ResponseFormatTextConfig'?
AI assistant:
Thanks for reaching out, Pavan. I'm an AI assistant and I'm here to help. Give me a moment while I research this topic.
Pavan:
yes
AI assistant:
It looks like 'ResponseTextConfig' isn’t available in the version of the OpenAI library you have installed. Could you share a bit more about what you're trying to do with this import? For example, are you working on streaming responses, chat completions, or something else with the OpenAI API? Also, could you double-check which version of the OpenAI library you have installed? You can do that by running `pip show openai`. I’m here to help once you’ve got that info!

Sources: livekit-plugins/livekit-plugins-openai/livekit/plugins/openai/realtime/real...
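If the shell isn’t handy, the same version check can be done from Python itself with the standard-library `importlib.metadata` (equivalent information to `pip show`):

```python
# Print the installed openai and litellm versions (same info as `pip show`).
from importlib.metadata import version

for pkg in ("openai", "litellm"):
    print(pkg, version(pkg))
```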