# troubleshooting
**konrad schlatte**
**Is this your first time deploying Airbyte**: No
**OS Version / Instance**: EC2 (t3.xlarge)
**Memory / Disk**: 16GB RAM / 4 vCPU
**Deployment**: Docker (docker-compose)
**Airbyte Version**: 0.30.15-alpha

I am running a custom source connector for Salesforce Marketing Cloud with a Snowflake destination and am getting the following timeout error:
```
2022-03-10 08:35:13 INFO () DefaultAirbyteStreamFactory(internalLog):90 - Done retrieving results from 'sent' endpoint
2022-03-10 08:35:13 INFO () DefaultAirbyteStreamFactory(internalLog):90 - Updating state.
2022-03-10 08:35:13 INFO () DefaultAirbyteStreamFactory(internalLog):90 - Fetching sent from 2022-03-09T12:00:00Z to 2022-03-09T12:30:00Z
2022-03-10 08:35:13 INFO () DefaultAirbyteStreamFactory(internalLog):90 - Making RETRIEVE call to 'sent' endpoint with filters '{'Property': 'EventDate', 'SimpleOperator': 'between', 'Value': ['2022-03-09T12:00:00Z', '2022-03-09T12:30:00Z']}'.
2022-03-10 08:35:13 ERROR () DefaultAirbyteStreamFactory(internalLog):88 - Request failed with 'Error: Execution Timeout Expired. The timeout period elapsed prior to completion of the operation or the server is not responding.'
2022-03-10 08:35:13 ERROR () DefaultAirbyteStreamFactory(internalLog):88 - Traceback (most recent call last):
2022-03-10 08:35:13 ERROR () DefaultAirbyteStreamFactory(internalLog):88 -   File "/usr/local/lib/python3.7/site-packages/tap_exacttarget/__init__.py", line 135, in do_sync
```
I can resolve this by reducing the "pagination window" from 30 minutes to, for example, 5 minutes; i.e. at the 30-minute interval there appears to be too much data to process in one call, hence the timeout. I am wondering whether there is another way to handle this error. There is also an outstanding PR for this connector: https://github.com/airbytehq/airbyte/pull/10026.
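A minimal sketch of the windowing idea in Python, assuming a hypothetical `window_in_minutes` knob (the `generate_windows` helper and parameter name below are illustrative, not the connector's actual code):

```python
from datetime import datetime, timedelta
from typing import Iterator, Tuple

def generate_windows(
    start: datetime, end: datetime, window_in_minutes: int = 30
) -> Iterator[Tuple[datetime, datetime]]:
    """Split [start, end] into fixed-size windows so that each RETRIEVE
    call covers a small enough interval to finish before the server-side
    execution timeout."""
    step = timedelta(minutes=window_in_minutes)
    cursor = start
    while cursor < end:
        window_end = min(cursor + step, end)
        yield cursor, window_end
        cursor = window_end

# Example: sync one hour of 'sent' events in 5-minute windows instead of 30.
for window_start, window_end in generate_windows(
    datetime(2022, 3, 9, 12, 0), datetime(2022, 3, 9, 13, 0), window_in_minutes=5
):
    print(f"RETRIEVE 'sent' where EventDate between {window_start} and {window_end}")
```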
**Augustin Lafanechere (Airbyte)**
Hi @konrad schlatte, your approach sounds like the right one; we have a similar kind of parameter in our Google Analytics connector (*windows in days*). I think this should be exposed in the configuration so that users can tweak it according to the volume of data they need to handle. And thanks for the contribution!
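For illustration, exposing such a knob in a Python connector could look roughly like the following; the `window_in_minutes` property and `window_from_config` helper are hypothetical names, and the real connector's spec may differ:

```python
# Hypothetical spec fragment: a user-facing pagination-window knob, analogous
# to the Google Analytics connector's "windows in days" parameter.
WINDOW_PROPERTY = {
    "window_in_minutes": {
        "type": "integer",
        "title": "Pagination window (minutes)",
        "description": "Size of each EventDate window per RETRIEVE call. "
                       "Lower this value if syncs hit execution timeouts.",
        "default": 30,
        "minimum": 1,
    }
}

def window_from_config(config: dict) -> int:
    """Read the window size from the user-supplied connector configuration,
    falling back to the default when the field is absent."""
    return int(config.get("window_in_minutes", 30))
```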
**konrad schlatte**
ok got it thanks!
Hi @Augustin Lafanechere (Airbyte), unfortunately I am still having problems with this connection: lots of timeouts, even when reducing the time bucket to the minimum. Airbyte still saves the data to Snowflake up until the timeout occurs, but the next sync then starts from the beginning again rather than from the last successful transfer; I suppose this is because of the error. Any ideas how we can change that behaviour so the sync starts from the last successful time bucket?
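One common pattern for this in Python connectors built on the Airbyte CDK is to checkpoint state after each completed window, so a later failure resumes from the last checkpoint rather than the configured start date. A minimal sketch, assuming an incremental stream with an `EventDate` cursor (the class below is an illustrative stand-in, not the actual connector code):

```python
from typing import Any, Mapping, MutableMapping

class SentEventsStream:
    """Illustrative stand-in for an incremental stream with an EventDate cursor."""

    cursor_field = "EventDate"

    def get_updated_state(
        self,
        current_stream_state: MutableMapping[str, Any],
        latest_record: Mapping[str, Any],
    ) -> MutableMapping[str, Any]:
        """Advance the cursor to the newest EventDate seen so far. If state
        like this is emitted after every completed window, a sync that later
        times out can resume from the last checkpoint instead of replaying
        the whole date range."""
        latest = latest_record[self.cursor_field]
        previous = current_stream_state.get(self.cursor_field, "")
        # ISO-8601 timestamps compare correctly as strings.
        return {self.cursor_field: max(latest, previous)}
```

Whether this applies directly here depends on the connector's internals, since the traceback shows it wraps a Singer tap (`tap_exacttarget`) rather than the Airbyte CDK.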