creamy-window-79763
02/28/2025, 5:46 PM

gray-monkey-83525
02/28/2025, 5:46 PM
The `max_output_tokens` parameter is not explicitly mentioned in the context of the `openai.LLM.with_azure` method. According to the LiveKit documentation, `with_azure` does not list `max_output_tokens` as a parameter. Instead, it accepts parameters such as `model`, `azure_endpoint`, `azure_deployment`, `api_version`, `api_key`, and `temperature`.
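For illustration, here is a minimal sketch of constructing an Azure-backed LLM using only the parameters listed above. It assumes the LiveKit Agents Python plugin import path (`livekit.plugins.openai`); the model name, deployment name, and API version are placeholder values for your own Azure resource, not confirmed defaults.

```python
import os

from livekit.plugins import openai

# Minimal sketch: build an Azure OpenAI LLM with the documented parameters.
# The endpoint and key are read from the environment; the model, deployment
# name, and API version below are placeholders for your Azure resource.
llm = openai.LLM.with_azure(
    model="gpt-4o",
    azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
    azure_deployment="my-gpt4o-deployment",
    api_version="2024-06-01",
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    temperature=0.7,
)
```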
Would you like more detail on how to work with the available parameters or any alternative approaches?
Sources:
- LiveKit documentation