Devesh Verma
11/03/2025, 7:01 AM
kapa.ai
11/03/2025, 7:01 AM
```json
{
  "data": {
    "workspace": {
      "id": "b510e39b-e9e2-4833-9a3a-963e51d35fb4",
      "name": "Workspace1",
      "url": "https://link/to/ws"
    },
    "connection": {
      "id": "64d901a1-2520-4d91-93c8-9df438668ff0",
      "name": "Connection",
      "url": "https://link/to/connection"
    },
    "source": {
      "id": "c0655b08-1511-4e72-b7da-24c5d54de532",
      "name": "Source",
      "url": "https://link/to/source"
    },
    "destination": {
      "id": "5621c38f-8048-4abb-85ca-b34ff8d9a298",
      "name": "Destination",
      "url": "https://link/to/destination"
    },
    "jobId": 9988,
    "startedAt": "2024-01-01T00:00:00Z",
    "finishedAt": "2024-01-01T01:00:00Z",
    "bytesEmitted": 1000,
    "bytesCommitted": 1000,
    "recordsEmitted": 89,
    "recordsCommitted": 89,
    "bytesEmittedFormatted": "1000 B",
    "bytesCommittedFormatted": "90 B",
    "success": true,
    "durationInSeconds": 3600,
    "durationFormatted": "1 hours 0 min"
  }
}
```
You can configure Airbyte to send this webhook to your external endpoint and then process the metadata as needed on your side. This is the closest built-in solution Airbyte provides for your use case. For more details on setting up webhook notifications, see the documentation: [Set up webhook notifications](https://docs.airbyte.com/platform/next/cloud/managing-airbyte-cloud/manage-airbyte-cloud-notifications#set-up-webhook-notifications).

If you need to send more granular or custom metadata (e.g., per document rather than per sync), you would need to implement a custom solution outside of Airbyte, such as a post-processing step triggered by the webhook, or by monitoring the S3 bucket for new files.

In summary:
- Airbyte can send sync-level metadata to an external endpoint via webhooks.
- There is no built-in feature to send per-document metadata to an external endpoint during the sync process itself.

If you need more fine-grained integration, you will need to build a custom workflow around Airbyte’s existing capabilities.
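For illustration, here is a minimal sketch of a receiver for this payload, using only the Python standard library. The port and the print-based handling are arbitrary choices for the example, not anything Airbyte prescribes; point the webhook notification URL at wherever this runs.

```python
# Minimal sketch of an endpoint that accepts Airbyte's sync-complete webhook.
# Port 8000 and the print-based handling are placeholder choices.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

class AirbyteWebhookHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        length = int(self.headers.get("Content-Length", 0))
        data = json.loads(self.rfile.read(length)).get("data", {})

        # Pull out a few of the sync-level fields shown in the payload above.
        print(
            f"Job {data.get('jobId')} on connection "
            f"{data.get('connection', {}).get('name')}: "
            f"success={data.get('success')}, "
            f"recordsCommitted={data.get('recordsCommitted')}, "
            f"duration={data.get('durationFormatted')}"
        )
        # ...forward the metadata to your own system here...

        self.send_response(200)
        self.end_headers()

if __name__ == "__main__":
    # Set Airbyte's webhook notification URL to http://<host>:8000/
    HTTPServer(("", 8000), AirbyteWebhookHandler).serve_forever()
```

Any HTTP framework would work equally well; the only contract is accepting the POSTed JSON shown above.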
Devesh Verma
11/03/2025, 7:59 AM
kapa.ai
11/03/2025, 8:01 AM
kapa.ai
11/03/2025, 8:01 AM
Devesh Verma
11/03/2025, 8:10 AM
kapa.ai
11/03/2025, 8:10 AM
kapa.ai
11/03/2025, 8:10 AM
`airbyte/airbyte-integrations/connectors/destination-s3`

For more details on contributing or modifying connectors, refer to the official documentation: [Contributing to Airbyte](https://docs.airbyte.com/platform/contributing-to-airbyte) and [Developing Locally: Connector Contributions](https://docs.airbyte.com/platform/contributing-to-airbyte/developing-locally#connector-contributions).
Devesh Verma
11/03/2025, 6:11 PM
1. Fork the Airbyte repository on GitHub.
2. Clone your fork locally.
3. Make your changes to the S3 destination connector code.
4. Build and test your custom connector locally.
5. Update your Airbyte instance to use your custom connector image.
When I checked the code in destination-s3, I found that the actual upload happens in airbyte-cdk, and that is where we need to make the change.
The destination-s3 connector handles the configuration part.
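For illustration, a minimal Python sketch of the kind of per-object hook described above: after each object is written, push its key and size to an external endpoint. The bucket name, endpoint URL, and function names are hypothetical placeholders, and this is standalone boto3 code rather than actual connector code; as the reply below explains, the S3 destination's real upload path lives in the Kotlin-based Bulk Load CDK.

```python
# Illustrative sketch only: wrap an S3 upload so every object written also
# pushes per-document metadata to an external endpoint. BUCKET, ENDPOINT,
# and the helper names are hypothetical placeholders.
import json
import urllib.request

import boto3  # assumes boto3 is installed and AWS credentials are configured

BUCKET = "my-airbyte-bucket"           # placeholder
ENDPOINT = "https://example.com/meta"  # placeholder

s3 = boto3.client("s3")

def notify_endpoint(key: str, size: int) -> None:
    # POST per-object metadata as JSON to the external endpoint.
    body = json.dumps({"bucket": BUCKET, "key": key, "bytes": size}).encode()
    req = urllib.request.Request(
        ENDPOINT, data=body, headers={"Content-Type": "application/json"}
    )
    urllib.request.urlopen(req)

def upload_with_metadata(key: str, data: bytes) -> None:
    # The upload itself, followed by the per-document metadata push.
    s3.put_object(Bucket=BUCKET, Key=key, Body=data)
    notify_endpoint(key, len(data))

upload_with_metadata("path/to/document.json", b'{"hello": "world"}')
```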
kapa.ai
11/03/2025, 6:12 PM
kapa.ai
11/03/2025, 6:13 PM
The Python CDK lives in a separate repository, [airbytehq/airbyte-python-cdk](https://github.com/airbytehq/airbyte-python-cdk) ([Python CDK documentation](https://docs.airbyte.com/platform/connector-development/cdk-python)), but for the S3 destination, which uses the Bulk Load CDK, you'll need to work with the Kotlin-based CDK code within the main Airbyte repository. This is a more complex customization than just modifying the connector configuration, as you'll be working with the underlying CDK framework itself.