# ask-ai
a
I want to send extra kwargs to the LLM client. I have initialized them in my main file. How can I send these extra params?
t
Thanks for reaching out, Aditya. I'm an AI assistant and I'm here to help. Give me a moment while I research this topic.
To send extra kwargs to the LLM client, you can follow these steps:

1. Collect the extra parameters in an `extra` dictionary. For example:
```python
extra = {"temperature": 0.7, "max_output_tokens": 100, "top_p": 0.9}
```
2. Pass this `extra` dictionary to the LLMStream constructor or directly to the API call of the LLM client. For instance, in the OpenAI implementation, you can use the following (a standalone, runnable sketch of the same pattern follows below):
```python
self._oai_stream = stream = await self._client.chat.completions.create(
    messages=chat_ctx,
    tools=fnc_ctx,
    model=self._model,
    stream_options={"include_usage": True},
    stream=True,
    timeout=httpx.Timeout(self._conn_options.timeout),
    **extra,
)
```
3. Ensure that the parameters in the `extra` dictionary are supported by the LLM provider you are using (e.g., OpenAI, Google, Mistral). Each provider may have specific parameters like `temperature`, `max_output_tokens`, `top_p`, etc.

Let me know if you need further clarification or assistance!

Sources: GitHub::livekit::agents | livekit-plugins/livekit-plugins-openai/livekit/pl...
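For reference, here is a minimal, self-contained sketch of the same collect-and-forward pattern using the official `openai` Python SDK directly. The model name, message, and parameter values are illustrative assumptions, and note that OpenAI's Chat Completions API takes `max_tokens` rather than `max_output_tokens`:
```python
# Sketch: collect optional kwargs in a dict and forward them with ** unpacking.
# Model name and parameter values are illustrative, not recommendations.
import asyncio

from openai import AsyncOpenAI


async def main() -> None:
    client = AsyncOpenAI()  # reads OPENAI_API_KEY from the environment

    # Collect only the overrides you actually want to send.
    extra = {"temperature": 0.7, "top_p": 0.9, "max_tokens": 100}

    stream = await client.chat.completions.create(
        model="gpt-4o-mini",  # illustrative model name
        messages=[{"role": "user", "content": "Say hello in one sentence."}],
        stream=True,
        **extra,  # each dict entry becomes a keyword argument
    )
    async for chunk in stream:
        # Chunks can arrive with empty choices (e.g. a final usage chunk), so guard.
        if chunk.choices and chunk.choices[0].delta.content:
            print(chunk.choices[0].delta.content, end="", flush=True)


asyncio.run(main())
```
The key idea is that `**extra` unpacks every entry of the dictionary into a keyword argument, so anything you add to `extra` upstream is forwarded to the client call unchanged.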
a
how to collect this extra dictionary?
t
Hey there! 👋 Let me walk you through collecting that extra dictionary for your LLM client. It's actually pretty straightforward!

1. First up, you'll want to start with an empty dictionary:
```python
extra = {}
```
2. Then, you can check for configuration options using something like `is_given()`. Here's how that might look:
```python
if is_given(self._opts.temperature):
    extra["temperature"] = self._opts.temperature
if is_given(self._opts.max_output_tokens):
    extra["max_output_tokens"] = self._opts.max_output_tokens
```
3. The key here is only adding parameters that are actually provided or valid. This keeps things clean and prevents any unnecessary clutter.
4. When you're ready, just pass this `extra` dictionary to your LLMStream constructor or API call, and you're all set! (A fuller, runnable sketch of the whole pattern follows at the end of this message.)

Stuck on any part of this? No worries! I'm happy to break it down further or help you troubleshoot. Just give me a shout if you need anything else! 😊

Sources: GitHub::livekit::agents | livekit-plugins/livekit-plugins-openai/livekit/pl...
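To make the pattern concrete, here is a self-contained sketch of how an `is_given()` check can work under the hood, using a sentinel default so that "not provided" is distinguishable from real values like `0` or `None`. The `NotGiven`, `LLMOptions`, and `collect_extra` names below are hypothetical stand-ins for illustration, not LiveKit's actual implementation:
```python
# Sketch of the sentinel pattern behind an is_given()-style check: a NOT_GIVEN
# default lets you tell "not provided" apart from legitimate values like 0 or None.
from __future__ import annotations

from dataclasses import dataclass
from typing import Any


class NotGiven:
    """Hypothetical sentinel type marking an unset option."""

    def __repr__(self) -> str:
        return "NOT_GIVEN"


NOT_GIVEN = NotGiven()


def is_given(value: Any) -> bool:
    """True if the option was explicitly set (i.e. is not the sentinel)."""
    return not isinstance(value, NotGiven)


@dataclass
class LLMOptions:
    # Defaulting to the sentinel, not None, keeps None usable as a real value.
    temperature: float | NotGiven = NOT_GIVEN
    max_output_tokens: int | NotGiven = NOT_GIVEN
    top_p: float | NotGiven = NOT_GIVEN


def collect_extra(opts: LLMOptions) -> dict[str, Any]:
    """Build the `extra` dict from whichever options were actually set."""
    extra: dict[str, Any] = {}
    if is_given(opts.temperature):
        extra["temperature"] = opts.temperature
    if is_given(opts.max_output_tokens):
        extra["max_output_tokens"] = opts.max_output_tokens
    if is_given(opts.top_p):
        extra["top_p"] = opts.top_p
    return extra


print(collect_extra(LLMOptions(temperature=0.7)))  # {'temperature': 0.7}
```
With something like this in place, `collect_extra(...)` produces exactly the dictionary you splat into the API call with `**extra`, and unset options never reach the provider at all.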