# ask-community-for-troubleshooting
f
Hello all, we are currently setting up an ELT integration for a customer, reading data from MySQL and writing it to Snowflake. We are using the cloud-hosted version of Airbyte and are still in the trial period. While the ELT process itself worked and we see the synchronised data both in Snowflake and in the destination logs, the sync summary says that 0 bytes and 0 records were written. As we are still testing, all streams use Full Refresh + Overwrite. I tried to find the reason for this error but could not find any issues raised on GitHub or other forums. Am I missing something, or is this a known issue?
k
Based on the information in the knowledge sources, there are a few relevant points regarding your issue with Airbyte Cloud showing "0 bytes and 0 records written" in the sync summary, even though data appears in Snowflake and the logs:

1. **Sync summary calculation:** Airbyte Cloud calculates data volume and record counts from what the Airbyte Platform observes during the sync, not from the destination database. Data is serialized in the Airbyte Protocol format in transit, so the reported size may differ from the final storage size in Snowflake, but the summary should still reflect the number of records and bytes moved. If the summary shows zero while the data is visible in Snowflake and in the logs, that points to a reporting or UI issue rather than a data-movement problem. [Review connection timeline](https://docs.airbyte.com/platform/next/cloud/managing-airbyte-cloud/review-connection-timeline#syncs-refreshes-and-clears)
2. **Known issues with sync reporting:** The provided sources contain no reports of this exact issue (data present in the destination while the sync summary shows zero records/bytes) for MySQL to Snowflake on Airbyte Cloud. There are several reports of syncs that fail to move data, show discrepancies, or have issues with large tables, but those typically involve missing data in the destination as well, not just in the summary.
3. **Potential UI or backend bug:** Since you see the data in both Snowflake and the logs but not in the sync summary, this could be a UI or backend metrics-reporting bug. There are forum posts and GitHub issues where users saw unexpected or confusing sync-size reporting, especially with Full Refresh + Overwrite mode, but not your exact scenario. For example, one user noted that the sync size displayed in Airbyte was much larger than what was written to the destination, due to the way Airbyte serializes data for transfer, but the record count was not reported as zero in those cases. [Forum post: sync size confusion](https://discuss.airbyte.io/t/1579)
4. **Next steps:**
   - Double-check that your syncs are marked as "Succeeded" in the Airbyte UI.
   - Review the logs for any warnings or errors related to metrics or reporting.
   - If the issue persists, it may be worth opening a support ticket or GitHub issue with Airbyte, as this could be a bug specific to the Cloud platform or the current version.

**Summary:** None of the provided sources describe your exact scenario (data present in the destination, but the sync summary shows zero records/bytes). This may be a UI or metrics-reporting bug in Airbyte Cloud. Since you are still in the trial period, reaching out to Airbyte support with your findings and logs would be the best next step. For how Airbyte calculates and displays sync metrics, see the official documentation: [Review connection timeline](https://docs.airbyte.com/platform/next/cloud/managing-airbyte-cloud/review-connection-timeline#syncs-refreshes-and-clears).
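As part of those next steps, one quick sanity check is to compare per-stream record counts from the sync summary against actual row counts in the destination. This is only an illustrative sketch, not an Airbyte API: the stream names and counts are placeholder values you would collect yourself (e.g. from the UI summary and a `SELECT COUNT(*)` per table in Snowflake).

```python
def find_zero_summary_streams(summary_counts, destination_counts):
    """Flag streams where the sync summary reports zero records
    even though rows actually landed in the destination.

    Both arguments map stream name -> record count. With Full Refresh +
    Overwrite, the two counts should match exactly after a sync.
    """
    return sorted(
        stream
        for stream, dest_rows in destination_counts.items()
        if dest_rows > 0 and summary_counts.get(stream, 0) == 0
    )

# Placeholder numbers: summary from the Airbyte UI, counts from Snowflake.
suspect = find_zero_summary_streams(
    summary_counts={"orders": 0, "users": 120},
    destination_counts={"orders": 5000, "users": 120},
)
print(suspect)  # → ['orders']
```

If any stream shows up here, that mismatch (rows present, summary zero) is exactly the evidence worth attaching to a support ticket or GitHub issue.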
h
Hi Florian, thank you for evaluating Airbyte and for reaching out to the community. It's good to hear the data has synced over, but strange that the connection states 0 records and bytes were moved. Were you able to look at the connection logs themselves? Without more context it would be difficult to guess or pinpoint what is going on.
f
Hi Henry, thanks for reaching out. It appears the issue resolved itself on later runs of the stream, though those were done a few days later. Right now it is working without issues.