# general
Keshav Ravi:
Hi guys, I was configuring WrenAI with Ollama using Docker on Windows. Everything seems fine, but at the last step, question generation fails with the error below. Can someone help? I have a sufficient amount of memory.

```
2025-01-28 16:42:07 -------------------------------------------------------------------
2025-01-28 16:42:07
2025-01-28 16:42:07 E0128 11:12:07.745 32 wren-ai-service:60] An error occurred during question recommendation generation: litellm.APIError: APIError: OpenAIException - Error code: 500 - {'error': {'message': 'model requires more system memory (40.8 GiB) than is available (3.5 GiB)', 'type': 'api_error', 'param': None, 'code': None}}
2025-01-28 16:42:07 INFO: 172.18.0.5:37970 - "GET /v1/question-recommendations/efe80dc2-8009-42a9-959d-b5319a524d16 HTTP/1.1" 200 OK
```

Regards
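(Editor's note, not part of the original thread: on Windows, Docker Desktop runs containers inside a WSL2 VM whose memory is typically capped well below the host's total RAM, so Ollama inside the container may only see a few GiB even if the machine has plenty. The minimal Python sketch below can confirm how much memory the container actually sees; the script name, the `docker exec` invocation, and the assumption that Python 3 is available in the container are hypothetical, while the 40.8 GiB figure comes from the error above.)

```python
# Minimal sketch (not from the thread): print how much memory the container
# actually sees and compare it with what the model reportedly needs.
# Assumptions: Python 3 is available in the container; run it e.g. with
#   docker exec -it <ollama_container> python3 check_memory.py

REQUIRED_GIB = 40.8  # requirement reported in the error message above


def available_gib() -> float:
    """Return MemAvailable from /proc/meminfo in GiB (Linux-only)."""
    with open("/proc/meminfo") as f:
        for line in f:
            if line.startswith("MemAvailable:"):
                kib = int(line.split()[1])  # value is reported in kB (KiB)
                return kib / (1024 ** 2)
    raise RuntimeError("MemAvailable not found in /proc/meminfo")


if __name__ == "__main__":
    avail = available_gib()
    print(f"Memory available inside the container: {avail:.1f} GiB")
    if avail < REQUIRED_GIB:
        print("Not enough for this model: raise the Docker Desktop / WSL2 "
              "memory limit or use a smaller / more heavily quantized model.")
```

If the printed value is close to the 3.5 GiB in the error, the bottleneck is the memory cap on the Docker/WSL2 VM rather than the host's physical RAM.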
Stefan Krawczyk:
@Keshav Ravi this question is better targeted at the WrenAI community. WrenAI is built on Hamilton, but unfortunately we can't help you much here.
Keshav Ravi:
@Stefan Krawczyk Sure. Thank you.