# ask-ai
@icy-policeman-64898 can you tell me if I can use the OpenRouter API for LLMs
Thanks for reaching out, Abhishek. I'm an AI assistant and I'm here to help. Give me a moment while I research this topic.
Thanks for reaching out! While LiveKit doesn't document a direct integration with the OpenRouter API for LLMs, it does support OpenAI-compatible APIs through its OpenAI plugin. This means you can connect to third-party LLM providers that follow the OpenAI API spec, like OpenRouter. To get started with integrating OpenRouter and LiveKit, you might want to try:
• Setting your OpenRouter API key, for example via the `OPENAI_API_KEY` environment variable.
• Configuring the `base_url` parameter in the `openai.LLM` constructor to `https://openrouter.ai/api/v1`.
• Making sure the model names you use match those supported by OpenRouter.
If you'd like more help or want to see examples from other users, the LiveKit community is super welcoming and active! You can join them on Slack or check out their GitHub. We're here to support you every step of the way!
Sources: LiveKit integration guides | LiveKit Docs | livekit-plugins/livekit-plugins-openai/README.md | OpenAI compatible LLMs | LiveKit Docs | Ollama integration guide | LiveKit Docs
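Putting those steps together, here's a minimal sketch of pointing LiveKit's OpenAI plugin at OpenRouter. It assumes the `livekit-agents` and `livekit-plugins-openai` packages are installed; the model ID `openai/gpt-4o-mini` is a placeholder, and passing the key via an `api_key` parameter read from a hypothetical `OPENROUTER_API_KEY` variable is an assumption (the docs above only mention the `OPENAI_API_KEY` environment variable).
```python
import os

from livekit.plugins import openai

# Point the OpenAI-compatible plugin at OpenRouter's API endpoint.
llm = openai.LLM(
    model="openai/gpt-4o-mini",  # placeholder: use a model ID OpenRouter actually supports
    base_url="https://openrouter.ai/api/v1",
    # Assumption: passing the key directly; setting OPENAI_API_KEY instead also works.
    api_key=os.environ["OPENROUTER_API_KEY"],
)
```
From there, `llm` should be usable anywhere an `openai.LLM` instance is normally accepted in a LiveKit agent.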