# ask-ai
r
can livekit work with deepseek locally
t
Thanks for reaching out, Hrithik. I'm an AI assistant and I'm here to help. Give me a moment while I research this topic.
Yes, LiveKit can work with DeepSeek. To integrate DeepSeek with LiveKit Agents, follow these steps:
1. Install the required package:
```bash
pip install "livekit-agents[openai]~=1.0"
```
2. Set your DeepSeek API key in your `.env` file:
```
DEEPSEEK_API_KEY=<your-deepseek-api-key>
```
3. Use the OpenAI plugin's `with_deepseek` method to configure DeepSeek as your LLM:
```python
from livekit.plugins import openai

deepseek_llm = openai.LLM.with_deepseek(
    model="deepseek-chat",  # this is DeepSeek-V3
    temperature=0.7
)
```
You can also customize parameters like:
• `model`: The DeepSeek model to use (default is `"deepseek-chat"`).
• `temperature`: Controls output randomness (0-2, default is 1.0).
• `parallel_tool_calls`: Enables multiple simultaneous tool calls.
• `tool_choice`: Controls how the model uses tools (`auto`/`required`/`none`).
A fuller sketch combining these options is shown below. If you need further assistance, feel free to ask!
Sources: DeepSeek LLM integration guide | LiveKit Docs
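For illustration, here is a hedged sketch that combines the parameters listed above. The values chosen are arbitrary examples, and wiring the LLM into an `AgentSession` is an assumption based on the LiveKit Agents 1.x API rather than something stated in the steps above; it still relies on the `DEEPSEEK_API_KEY` environment variable from step 2.

```python
from livekit.agents import AgentSession
from livekit.plugins import openai

# Hedged sketch: configure DeepSeek through the OpenAI-compatible plugin.
# Parameter values are illustrative, not recommendations.
deepseek_llm = openai.LLM.with_deepseek(
    model="deepseek-chat",     # DeepSeek-V3
    temperature=0.5,           # 0-2; lower values give more deterministic output
    parallel_tool_calls=True,  # allow multiple tool calls in a single model turn
    tool_choice="auto",        # let the model decide when to call tools
)

# Assumption: in LiveKit Agents 1.x the LLM can be passed to the session like
# this (STT/TTS/VAD components omitted for brevity).
session = AgentSession(llm=deepseek_llm)
```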
r
I want to download deepseek model in my local pc not use any api keys
t
Currently, the available documentation and resources indicate that DeepSeek models are integrated via their hosted API, which requires an API key. The LiveKit docs don't appear to describe an option for downloading a DeepSeek model and running it locally without an API key. If you have further questions or need assistance with the API-based integration, feel free to let me know!
Sources: livekit.plugins.openai API documentation