# ask-ai
@kapa.ai let's say I update the webhook URL for successful-sync notifications. What does the JSON body from Airbyte look like on a successful sync? Does it include the connection ID for which the sync happened, and any additional details?
Yes, when you set up a webhook notification for successful syncs in Airbyte, the JSON payload sent to your webhook URL contains detailed information about the sync, including the connection ID and additional details. Here's an example of what the JSON body looks like for a successful sync notification:
```json
{
    "data": {
        "workspace": {
            "id": "b510e39b-e9e2-4833-9a3a-963e51d35fb4",
            "name": "Workspace1",
            "url": "https://link/to/ws"
        },
        "connection": {
            "id": "64d901a1-2520-4d91-93c8-9df438668ff0",
            "name": "Connection",
            "url": "https://link/to/connection"
        },
        "source": {
            "id": "c0655b08-1511-4e72-b7da-24c5d54de532",
            "name": "Source",
            "url": "https://link/to/source"
        },
        "destination": {
            "id": "5621c38f-8048-4abb-85ca-b34ff8d9a298",
            "name": "Destination",
            "url": "https://link/to/destination"
        },
        "jobId": 9988,
        "startedAt": "2024-01-01T00:00:00Z",
        "finishedAt": "2024-01-01T01:00:00Z",
        "bytesEmitted": 1000,
        "bytesCommitted": 1000,
        "recordsEmitted": 89,
        "recordsCommitted": 89,
        "bytesEmittedFormatted": "1000 B",
        "bytesCommittedFormatted": "90 B",
        "success": true,
        "durationInSeconds": 3600,
        "durationFormatted": "1 hours 0 min"
    }
}
```
As you can see, the JSON payload includes:

1. The connection ID (`connection.id`)
2. Additional details such as:
   - Workspace information
   - Source and destination details
   - Job ID
   - Start and finish times
   - Bytes and records emitted and committed
   - Duration of the sync
   - Success status

This comprehensive payload allows you to track and process the results of successful syncs in your external systems or workflows. [Source: Airbyte Documentation - Configure Webhook Notification Settings](https://docs.airbyte.com/cloud/managing-airbyte-cloud/manage-airbyte-cloud-notifications#configure-webhook-notification-settings)
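To make the payload shape concrete, here is a minimal sketch of a helper your webhook endpoint might use to pull out the fields most workflows care about. The function name `extract_sync_summary` and the chosen output keys are illustrative assumptions, not part of Airbyte's API; only the payload structure itself comes from the notification body shown above.

```python
import json


def extract_sync_summary(payload: dict) -> dict:
    """Extract commonly used fields from an Airbyte sync-notification
    webhook body (structure as shown in the example payload above).

    This helper is a hypothetical sketch, not an Airbyte API.
    """
    data = payload["data"]
    return {
        "connection_id": data["connection"]["id"],
        "job_id": data["jobId"],
        "success": data["success"],
        "records_committed": data["recordsCommitted"],
    }


# Usage with a trimmed-down version of the payload above
body = json.loads(
    '{"data": {"connection": {"id": "64d901a1-2520-4d91-93c8-9df438668ff0"},'
    ' "jobId": 9988, "success": true, "recordsCommitted": 89}}'
)
summary = extract_sync_summary(body)
print(summary["connection_id"])  # the connection the sync ran for
```

Your webhook handler would call something like this after parsing the request body, then route on `connection_id` to trigger downstream jobs per connection.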
@kapa.ai, Airbyte connection state is not getting saved (per stream). Is there any config to enable it?
Based on the information provided in the knowledge sources, I can offer some insights about Airbyte's per-stream state functionality and potential configurations. Airbyte has moved away from legacy state handling to per-stream (or global) state management. This change was made to enable more granular operations like per-stream resets and parallelization. If you're experiencing issues with per-stream state not being saved, here are a few things to consider:

1. Ensure you're using an up-to-date version of Airbyte. Per-stream state was introduced in later versions, and legacy state support has been deprecated. As of [Airbyte Platform version v0.62.4](https://github.com/airbytehq/airbyte/releases/tag/v0.62.4), legacy state is no longer supported.
2. If you're using a custom connector, make sure it's built with a recent version of the Airbyte CDK (Connector Development Kit). The CDK should be version 1.3.1 or higher to support per-stream states.
3. Check that your connector implements state checkpointing correctly. There are two main ways to checkpoint state: interval-based checkpointing and stream slices.
4. For interval-based checkpointing, set the `state_checkpoint_interval` property in your stream class. For example:

   ```python
   from airbyte_cdk.sources.streams import Stream

   class MyStream(Stream):
       # Save state every 100 records
       state_checkpoint_interval = 100
   ```

5. If you're using stream slices, ensure that your `stream_slices` method is implemented correctly so that state is emitted after each slice is processed.
6. Make sure your connector is emitting state messages correctly. State should be emitted even if there are no records, to support state-based counts/checksums.

If you've checked these points and are still experiencing issues, it might be helpful to review your specific connector implementation or consult the Airbyte documentation for more detailed guidance. For more information on implementing per-stream state and checkpointing, you can refer to the [Airbyte CDK Python documentation on Incremental Streams](https://docs.airbyte.com/connector-development/cdk-python/incremental-stream#checkpointing-state). If you're still facing issues after checking these configurations, it might be worth opening a new topic on the Airbyte forum or reaching out to Airbyte support for more specific assistance with your setup.
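The interval-based checkpointing behavior described above can be illustrated with a small toy generator. This is not the real CDK API (when you set `state_checkpoint_interval`, the CDK interleaves state messages for you); the function below is a hypothetical simulation showing how state messages end up interleaved with records, including the final state emitted even on an empty read.

```python
def read_with_checkpoints(records, checkpoint_interval=100):
    """Toy simulation of interval-based state checkpointing.

    Yields ("RECORD", ...) tuples and interleaves a ("STATE", ...) tuple
    after every `checkpoint_interval` records, plus one final state
    message at the end of the read, even when there are no records.
    """
    cursor = None
    for i, rec in enumerate(records, start=1):
        yield ("RECORD", rec)
        cursor = rec["updated_at"]  # hypothetical cursor field
        if i % checkpoint_interval == 0:
            yield ("STATE", {"cursor": cursor})
    # Final state goes out regardless, mirroring the advice that state
    # should be emitted even with zero records.
    yield ("STATE", {"cursor": cursor})


# Usage: 5 records with a checkpoint every 2 records
records = [{"id": n, "updated_at": f"2024-01-01T00:00:0{n}Z"} for n in range(5)]
messages = list(read_with_checkpoints(records, checkpoint_interval=2))
# State messages appear after records 2 and 4, plus one final state
```

If state messages like these never appear in your sync logs, the connector is likely not checkpointing at all, which matches the symptom of per-stream state not being saved.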