# troubleshooting
t
*Is this your first time deploying Airbyte*: No
*OS Version / Instance*: macOS 11.6.1
*Memory / Disk*: 16GB / 500GB SSD
*Deployment*: Docker
*Airbyte Version*: 0.34.0-alpha
*Source name/version*: MySQL 0.4.13 (0.5.0 and 0.5.1 fail the connection test)
*Destination name/version*: S3 0.1.16
*Step*: On sync

*Description*: Extracting MySQL tables to S3 as Parquet files with Snappy compression has worked for tables as large as 6GB so far, but a 14GB table fails without reading any rows. It seems to time out after 5 minutes while waiting for query results.
```
source - 2021-12-16 18:38:36 INFO () DefaultAirbyteStreamFactory(lambda$create$0):61 - 2021-12-16 18:38:36 INFO i.a.i.s.m.MySqlSource(getIncrementalIterators):181 - {} - using CDC: false
source - 2021-12-16 18:38:36 INFO () DefaultAirbyteStreamFactory(lambda$create$0):61 - 2021-12-16 18:38:36 INFO i.a.i.s.r.AbstractRelationalDbSource(queryTableFullRefresh):35 - {} - Queueing query for table: establishments
source - 2021-12-16 18:43:48 INFO () DefaultAirbyteStreamFactory(lambda$create$0):61 - 2021-12-16 18:43:48 INFO i.a.i.s.r.AbstractDbSource(lambda$read$2):123 - {} - Closing database connection pool.
source - 2021-12-16 18:43:48 INFO () DefaultAirbyteStreamFactory(lambda$create$0):61 - 2021-12-16 18:43:48 INFO i.a.i.s.r.AbstractDbSource(lambda$read$2):125 - {} - Closed database connection pool.
source - 2021-12-16 18:43:48 ERROR () LineGobbler(voidCall):82 - Exception in thread "main" java.lang.RuntimeException: java.lang.RuntimeException: com.mysql.cj.jdbc.exceptions.CommunicationsException: Communications link failure
source - 2021-12-16 18:43:48 ERROR () LineGobbler(voidCall):82 -
source - 2021-12-16 18:43:48 ERROR () LineGobbler(voidCall):82 - The last packet successfully received from the server was 310,727 milliseconds ago. The last packet sent successfully to the server was 310,727 milliseconds ago.
```
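The 310,727 ms gap (just over five minutes) suggests an idle-connection limit somewhere between Airbyte and MySQL dropped the link while the server was still preparing the large result set. A quick way to check for a matching server-side limit is to list MySQL's timeout variables. Below is a minimal standalone sketch of such a probe over JDBC; the host, database, and credentials are placeholders, and it assumes MySQL Connector/J is on the classpath.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

// Hypothetical probe (not part of Airbyte): lists MySQL's timeout-related
// server variables so they can be compared against the ~310 s gap in the
// log (a net_write_timeout or wait_timeout near 300 s would be a likely
// culprit).
public class TimeoutProbe {
    public static void main(String[] args) throws Exception {
        String url = "jdbc:mysql://mysql.example.com:3306/mydb"; // placeholder
        try (Connection conn = DriverManager.getConnection(url, "user", "password");
             Statement stmt = conn.createStatement();
             ResultSet rs = stmt.executeQuery("SHOW VARIABLES LIKE '%timeout%'")) {
            while (rs.next()) {
                System.out.printf("%s = %s%n", rs.getString(1), rs.getString(2));
            }
        }
    }
}
```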
u
@Liren Tu do you know if the S3 Destination has a memory leak?
l
Not that we know of. But it is possible.
@Tom Gordon, how many tables are you syncing from MySQL? Currently each stream takes a fixed amount of memory. So if there are too many tables, syncing to S3 does not work.
t
I’ve been setting up one connection per table, so it only runs one table per sync
l
How many columns does that table have?
Could you upload the full logs?
t
47 columns, ~680k rows, ~14GB
l
No clue so far. The data volume should be within the currently processable range. Sorry, two follow-up questions:
1. Is it possible that there are super wide columns? For example, could the `as_json` column hold a large JSON blob? I'm asking because the MySQL connector currently fetches 1,000 rows at a time, and one possibility is that 1,000 such rows take up too much memory.
2. You mentioned that "0.5.0 and 0.5.1 fail the connection test". Do you still have / remember the error message?
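For context on the 1,000-row fetch mentioned above: with MySQL Connector/J, a plain statement buffers the entire result set in memory by default, so the fetch strategy matters a great deal for a 14GB table. The sketch below is illustrative only, not the connector's actual code, and uses placeholder connection details; it shows the two standard Connector/J ways to stream results instead.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;

public class FetchModes {
    public static void main(String[] args) throws Exception {
        // useCursorFetch=true makes setFetchSize(n) use a server-side cursor,
        // pulling n rows per round trip instead of buffering the whole table.
        String url = "jdbc:mysql://mysql.example.com:3306/mydb?useCursorFetch=true"; // placeholder
        try (Connection conn = DriverManager.getConnection(url, "user", "password");
             PreparedStatement ps = conn.prepareStatement("SELECT * FROM establishments")) {
            // 1,000 rows per batch: if each row carries a multi-megabyte
            // as_json blob, one batch can still occupy gigabytes of heap,
            // which is why very wide columns are worth checking.
            ps.setFetchSize(1000);
            try (ResultSet rs = ps.executeQuery()) {
                while (rs.next()) {
                    // process one row at a time
                }
            }
        }
        // Alternative (without useCursorFetch): setFetchSize(Integer.MIN_VALUE)
        // switches Connector/J to row-by-row streaming over a single result set.
    }
}
```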
t
1. Yes, `as_json` is just the entire row crammed into a JSON blob. I don't need it, but I don't think there's a way to exclude it.
2. I'm setting it up again quickly to pull the log.
Thanks for your help!
l
Thank you so much for doing these extra things to help debug this.
t
The sync fails before it starts fetching rows. There’s no network activity and the memory isn’t filling up
l
Right, from the logs this looks like a connection issue:
```
Caused by: javax.net.ssl.SSLHandshakeException: No appropriate protocol (protocol is disabled or cipher suites are inappropriate)
```
I have created an issue here: https://github.com/airbytehq/airbyte/issues/8890. Will continue looking into it later today.
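For anyone who hits the same handshake error: it typically appears when the JDK has disabled TLSv1/TLSv1.1 but the MySQL server offers nothing newer. Below is a hedged sketch of two common workarounds, using a placeholder URL and credentials; pinning TLSv1.2 only helps if the server actually supports it, and disabling SSL should be reserved for trusted networks.

```java
import java.sql.Connection;
import java.sql.DriverManager;

// Sketch of workarounds for "No appropriate protocol (protocol is disabled
// or cipher suites are inappropriate)". Host, database, and credentials
// are placeholders.
public class TlsWorkaround {
    public static void main(String[] args) throws Exception {
        // Option 1: pin TLSv1.2 explicitly (requires server-side support).
        String pinned = "jdbc:mysql://mysql.example.com:3306/mydb?enabledTLSProtocols=TLSv1.2";
        // Option 2 (blunter): skip TLS entirely; only for trusted networks.
        // String plain = "jdbc:mysql://mysql.example.com:3306/mydb?useSSL=false";

        try (Connection conn = DriverManager.getConnection(pinned, "user", "password")) {
            System.out.println("Connected to " + conn.getMetaData().getDatabaseProductVersion());
        }
    }
}
```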
t
👍 thanks
z
hi there - do you have news on this by any chance?
a