# ask-community-for-troubleshooting
r
Hey everyone! I'm trying to send data from Postgres to BigQuery and after a LONG time, I received the following error:
Caused by: com.google.cloud.bigquery.BigQueryException: Cannot query rows larger than 100MB limit.
Do you know how I can solve it?
u
@[DEPRECATED] Marcos Marx turned this message into Zendesk ticket 2486 to ensure timely resolution!
s
How long did it take for the sync to fail? Is this your first time syncing from Postgres to BigQuery? It seems BigQuery puts a limit on how big your rows can be: https://cloud.google.com/bigquery/quotas#:~:text=BigQuery%20Omni.-,Maximum%20row%20size,100%20MB,-The%20maximum%20row.
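If you want to check the source side before kicking off another long sync, something like the sketch below can surface the widest rows. It's only a rough approximation (Postgres' row size isn't exactly what BigQuery counts after serialization), and the connection string plus `public.my_table` are placeholders for your real credentials and table:
```python
# Rough check for oversized rows in the source table before re-running the sync.
# Assumptions: psycopg2 is installed; the connection string and
# "public.my_table" are placeholders for your real credentials and table.
import psycopg2

conn = psycopg2.connect("postgresql://user:password@host:5432/mydb")
with conn, conn.cursor() as cur:
    # pg_column_size(t.*) measures the whole row as a composite value, in bytes.
    # It is only an approximation of what BigQuery sees, but anything near
    # 104857600 bytes (100 MB) is a likely offender.
    cur.execute(
        """
        SELECT ctid, pg_column_size(t.*) AS row_bytes
        FROM public.my_table AS t
        ORDER BY row_bytes DESC
        LIMIT 20;
        """
    )
    for ctid, row_bytes in cur.fetchall():
        print(ctid, f"{row_bytes / 1024 / 1024:.1f} MB")
conn.close()
```
Anything already in the tens of MB on the Postgres side is a good candidate for tripping the 100 MB limit once it lands in BigQuery.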
r
Hey Saj! Thanks for answering my question. Let me see: 1. One table with 337K rows in 6h52m; 2. Yes, it is. I managed to sync other tables perfectly, but this one is harder; 3. Thanks for the article. Is there any way around this? I'm sorry if my questions are too basic.
u
Don't apologize, it's a great question. The only way around this is most likely to either simplify the row or partition the data so that it's below the row limit.
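For example, if a single huge JSON or text column is the culprit, one option is to expose a trimmed-down view and sync that instead of the base table. A minimal sketch, assuming a hypothetical `payload` column is the offender (all names are placeholders, adjust to your schema):
```python
# Sketch: create a trimmed view for the connector to read instead of the base
# table. "public.my_table" and its "payload" column are hypothetical names;
# swap in whatever column is actually blowing past the row limit.
import psycopg2

conn = psycopg2.connect("postgresql://user:password@host:5432/mydb")
with conn, conn.cursor() as cur:
    cur.execute(
        """
        CREATE OR REPLACE VIEW public.my_table_slim AS
        SELECT
            id,
            updated_at,
            -- keep only the first 1 MB of the large column so no single row
            -- can approach BigQuery's 100 MB row limit
            left(payload::text, 1024 * 1024) AS payload_truncated
        FROM public.my_table;
        """
    )
conn.close()
```
The Postgres source should be able to discover the view as a stream, so you'd select `my_table_slim` in the connection instead of the original table.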
r
So, what I did: 1. I extracted a CSV file from my database and uploaded it to GCP; 2. Created a table exactly like Airbyte does (even with the Airbyte columns). And it worked. But when I try to run an Incremental (Deduped + History) replication, Airbyte starts reading all the data again. Is that normal? Shouldn't it just read the increments based on "updated_at"?
And after that the same error happened. Shouldn't Incremental (Deduped + History) replication just read the increments? Why extract all the data?
s
Hey @Renan Rigo Calesso, are you using CDC with the Postgres connector? Can you share the logs of the last sync? It might be an issue with the cursor, but we need more info so we can reproduce/escalate the issue.
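For context on why a full re-read is surprising: with a plain (non-CDC) incremental sync, the connector is only expected to pull rows whose cursor value is past the last saved state, conceptually like the sketch below (illustrative only, not the connector's actual query; the table and cursor names are placeholders):
```python
# Illustrative sketch of cursor-based incremental reads: only rows whose
# cursor column is past the last saved state should be fetched on later syncs.
# "public.my_table" and "updated_at" are placeholder names; the real connector
# manages its cursor state internally.
import psycopg2

last_cursor = "2023-01-15 00:00:00"  # would normally come from the saved sync state

conn = psycopg2.connect("postgresql://user:password@host:5432/mydb")
with conn, conn.cursor() as cur:
    cur.execute(
        """
        SELECT *
        FROM public.my_table
        WHERE updated_at > %s
        ORDER BY updated_at;
        """,
        (last_cursor,),
    )
    rows = cur.fetchall()
conn.close()
print(f"Fetched {len(rows)} new/updated rows since {last_cursor}")
```
Keep in mind that the very first sync of a connection, or any sync right after a reset or a sync-mode change, will read everything, which could also explain what you saw. The logs should show which case this is.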