Tarcisio Pacheco
10/25/2024, 7:19 PM (user)
10/25/2024, 7:19 PM

1. Adjust memory allocation: You can increase the memory available to the sync job's main container using these environment variables (a .env sketch with example values follows at the end of this answer):

   - `JOB_MAIN_CONTAINER_MEMORY_REQUEST`
   - `JOB_MAIN_CONTAINER_MEMORY_LIMIT`

   Increase these values to allocate more memory to the sync job.

2. Adjust CPU allocation: Similarly, you can increase the CPU allocation if needed using these environment variables:

   - `JOB_MAIN_CONTAINER_CPU_REQUEST`
   - `JOB_MAIN_CONTAINER_CPU_LIMIT`
3. Consider scaling your infrastructure: If you're running Airbyte on a VM or Kubernetes cluster, you might need to scale up your resources. As [mentioned in a forum post](https://discuss.airbyte.io/t/861), increasing the overall memory of your Airbyte instance can help with large syncs (a hedged Kubernetes resources sketch also follows at the end of this answer).

Regarding batch size for the Oracle source, unfortunately the knowledge sources don't provide specific information about configuring batch size for the Oracle connector. However, it's worth noting that Airbyte has made some improvements in this area: as [mentioned in a forum post](https://discuss.airbyte.io/t/1168), recent versions of Airbyte connectors use a dynamic batch fetch system, which should help with performance on large datasets.

If these suggestions don't resolve your issue, you might want to consider:

1. Breaking the sync into smaller chunks, for example by using incremental sync where possible.
2. Checking for potential bottlenecks in your Oracle database configuration.
3. Reaching out to Airbyte support or opening a GitHub issue for further investigation, since large-scale performance issues may require more in-depth troubleshooting.

Remember, syncing very large datasets can be challenging and may require fine-tuning of both Airbyte and your infrastructure configuration.
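To make steps 1 and 2 concrete, here is a minimal sketch of how these variables might look in the .env file used by a Docker Compose deployment. The variable names come from the answer above; the values (2Gi, 4Gi, 1, 2) are illustrative assumptions, not recommendations, so size them to what your host can actually spare.

```
# .env (Docker Compose deployment) -- example values only, tune to your host
JOB_MAIN_CONTAINER_MEMORY_REQUEST=2Gi
JOB_MAIN_CONTAINER_MEMORY_LIMIT=4Gi
JOB_MAIN_CONTAINER_CPU_REQUEST=1
JOB_MAIN_CONTAINER_CPU_LIMIT=2
```

You'll generally need to restart the Airbyte services after changing these so that newly launched sync jobs pick up the values.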
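For step 3 on a Kubernetes/Helm deployment, the same limits are usually expressed as pod resource requests and limits for job containers. The sketch below assumes the chart exposes a `global.jobs.resources` block; the exact key path and defaults vary between chart versions, so verify against your chart's values reference before applying.

```yaml
# values.yaml -- assumed key path; confirm against your Helm chart version
global:
  jobs:
    resources:
      requests:
        cpu: "1"       # illustrative values only; size to your cluster
        memory: 2Gi
      limits:
        cpu: "2"
        memory: 4Gi
```

Apply the change with your usual helm upgrade command; only jobs launched after the upgrade will get the new limits.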