# all-things-deployment
Hello everyone, I'm trying to ingest Snowflake metadata (my DataHub setup runs in a Kubernetes cluster), and while doing so I'm getting an out-of-memory error:
```
ERROR: The ingestion process was killed, likely because it ran out of memory. You can resolve this issue by allocating more memory to the datahub-actions container.
```
When I go through the values.yml file, I can see that the datahub-actions container has 512Mi of memory. My questions are: when we ingest metadata, in which container is it stored? If the data in Snowflake we are trying to ingest is in the GBs, how much do we have to scale up the memory of the actions container? And is there a way to find out the size of the data/metadata we are trying to ingest (from Snowflake or any other source)? Can someone help me with this?
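For the memory limit itself, the usual fix is to raise the resources for the actions component in the Helm chart's values.yml and run a `helm upgrade`. A minimal sketch is below; the exact key path and defaults can differ between chart versions, so treat the values shown as assumptions to verify against your own chart:

```yaml
# Hypothetical excerpt from the DataHub Helm chart's values.yml.
# The "acryl-datahub-actions" key and the 2Gi/1Gi figures are
# illustrative; check your chart version for the actual key path.
acryl-datahub-actions:
  resources:
    limits:
      memory: "2Gi"   # raised from the 512Mi default mentioned above
    requests:
      cpu: "300m"
      memory: "1Gi"
```

Then apply the change to the release (release and chart names here are assumptions):

```
helm upgrade datahub datahub/datahub --values values.yml -n <namespace>
```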
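On sizing: as I understand it, DataHub ingests metadata, not the table data itself, so memory usage scales with the number of databases/schemas/tables/columns scanned (and especially with profiling, which does read table contents) rather than with the GBs stored in Snowflake. A hedged sketch of a recipe that narrows the scan and disables profiling to reduce the footprint; the account/database names are placeholders:

```yaml
# Hypothetical Snowflake ingestion recipe. Limiting the scanned
# databases and disabling profiling usually cuts memory use the most.
source:
  type: snowflake
  config:
    account_id: "<account>"
    username: "<user>"
    password: "<password>"
    warehouse: "<warehouse>"
    database_pattern:
      allow:
        - "ANALYTICS"   # only scan the databases you actually need
    profiling:
      enabled: false    # profiling reads table contents and is the usual memory hog

sink:
  type: datahub-rest
  config:
    server: "http://datahub-gms:8080"   # assumed in-cluster GMS address
```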