steep-balloon-41261
07/25/2025, 5:46 PM

tall-airline-16756
07/25/2025, 5:47 PM
You can point LiveKit's OpenAI plugin at a custom LLM endpoint by setting the `base_url` parameter. Here's an example:

```python
from livekit.plugins import openai

# Create an LLM that talks to a custom OpenAI-compatible endpoint
llm = openai.LLM(
    model="your-model-name",
    api_key="your-api-key",  # or set to a placeholder if your endpoint doesn't need one
    base_url="https://your-custom-endpoint/v1",
)
```

This approach works seamlessly if your endpoint adheres to the OpenAI API specification.
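To put the pieces together, here's a minimal sketch of wiring that `llm` into an agent session. The entrypoint structure and the `instructions` text follow the LiveKit Agents quickstart pattern but are illustrative assumptions, not part of the original answer, and details may vary by livekit-agents version:

```python
from livekit import agents
from livekit.agents import Agent, AgentSession
from livekit.plugins import openai


async def entrypoint(ctx: agents.JobContext):
    # Hypothetical wiring: hand the custom-endpoint LLM to an AgentSession
    # (stt/tts/vad omitted for brevity; a full voice pipeline needs them)
    session = AgentSession(
        llm=openai.LLM(
            model="your-model-name",
            api_key="your-api-key",
            base_url="https://your-custom-endpoint/v1",  # placeholder endpoint
        ),
    )
    await session.start(
        room=ctx.room,
        agent=Agent(instructions="You are a helpful assistant."),
    )
    await ctx.connect()


if __name__ == "__main__":
    agents.cli.run_app(agents.WorkerOptions(entrypoint_fnc=entrypoint))
```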
• Implement a Custom LLM Node: For completely custom LLM implementations that don't follow the OpenAI API format, you can override the LLM node (a minimal sketch follows this list). This allows you to:
◦ Customize how the LLM is used
◦ Modify the chat context prior to inference
◦ Adjust how tool invocations and responses are handled
◦ Implement a custom LLM provider without a plugin
You can find more details on implementing a custom LLM node in the LiveKit documentation on LLM nodes.
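Here's a minimal sketch of what overriding the LLM node can look like, following the pattern in LiveKit's voice-pipeline-nodes documentation. The system-message injection is an illustrative assumption, and exact type names and signatures may differ between livekit-agents versions:

```python
from collections.abc import AsyncIterable

from livekit.agents import Agent, ModelSettings, llm


class CustomLLMAgent(Agent):
    async def llm_node(
        self,
        chat_ctx: llm.ChatContext,
        tools: list[llm.FunctionTool],
        model_settings: ModelSettings,
    ) -> AsyncIterable[llm.ChatChunk]:
        # Modify the chat context prior to inference
        # (illustrative: inject an extra system-style instruction)
        chat_ctx.add_message(role="system", content="Keep responses concise.")

        # Delegate to the default implementation, which calls the session's
        # configured LLM; replace this loop to call your own provider instead
        async for chunk in Agent.default.llm_node(self, chat_ctx, tools, model_settings):
            yield chunk
```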
If your custom endpoint is OpenAI-compatible, the first approach is the simplest. Otherwise, you can explore the second option for full customization.
Sources: Voice pipeline nodes (LiveKit Docs) · Large language model (LLM) integrations (LiveKit Docs) · livekit-agents/livekit/agents/llm/fallback_adapter.py