There isn't a single recommendation for syncing 80 GB of data: the ideal RAM and CPU for your Airbyte setup depend on factors such as the source and destination connectors, the complexity of the data, and the rate of ingestion. However, a few examples from the information available give an idea of what others have used:

1. A user with a 285 GB table ran an Airbyte server with 4 CPUs and 16 GB of RAM, but was advised to increase memory to improve the batch size ([source](https://airbytehq.slack.com/archives/C021JANJ6TY/p1663179159558039)).
2. Another user syncing a 500-million-row table from Postgres to Clickhouse used a setup with 64 GB of RAM and 8 TB of disk space ([source](https://discuss.airbyte.io/t/2571)).
3. A user syncing a MySQL table with 200 million records used an AWS EC2 instance (m5.8xlarge) with 32 vCPUs and 128 GB of RAM ([source](https://airbytehq.slack.com/archives/C021JANJ6TY/p1672748387444089)).

You can start with a setup similar to these examples, monitor performance during the sync, and adjust resources as needed. For more information on optimizing your setup, see the [Airbyte documentation on scaling](https://docs.airbyte.com/operator-guides/scaling-airbyte).
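If you run Airbyte with Docker Compose, per-job resource requests and limits can be tuned through the `JOB_MAIN_CONTAINER_*` environment variables described in the scaling documentation. A minimal sketch of a `.env` fragment; the values here are illustrative starting points for a sync of this size, not tested recommendations:

```
# Illustrative resource settings for Airbyte sync job containers (.env file).
# Assumption: a Docker Compose deployment; adjust values after observing
# actual memory/CPU usage during a sync.
JOB_MAIN_CONTAINER_MEMORY_REQUEST=2g
JOB_MAIN_CONTAINER_MEMORY_LIMIT=8g
JOB_MAIN_CONTAINER_CPU_REQUEST=1
JOB_MAIN_CONTAINER_CPU_LIMIT=4
```

Raising the memory limit is the usual first lever when database sources fetch rows in memory-bound batches, which matches the advice given to the user in the first example above.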