# ask-ai
I'm using MySQL as a source and Redshift as a destination. The pipeline runs with batches of 5000 records. How can I increase the batch size? I'm using the latest OSS version.
I'm sorry, but based on the provided knowledge sources, there is no specific information on how to increase the batch size when using MySQL as a source and Redshift as a destination in Airbyte. That said, there are performance considerations to keep in mind for batch/bulk writes: large records may impact performance, and connectors should ideally be enhanced to support variable batch sizes, as noted in the [Elasticsearch destination documentation](https://docs.airbyte.com/integrations/destinations/elasticsearch#performance-considerations). For more specific guidance, I recommend checking the Airbyte [documentation](https://docs.airbyte.io/) or reaching out to the Airbyte community on their [discussion forum](https://discuss.airbyte.io/).