# ask-ai
**m:** can i use "gemini-2.5-flash" with the `google.llm()` ?
**t:** Thanks for reaching out, Jack. I'm an AI assistant and I'm here to help. Give me a moment while I research this topic.

Hey there! It looks like the model "gemini-2.5-flash" isn't directly listed as a supported option for `google.llm()`. But don't worry, there are related models like "gemini-2.5-pro-preview-05-06" and "gemini-2.5-flash-preview-04-17" that are compatible. You might want to give one of those a try instead!