kapa.ai
11/02/2025, 3:07 AM
Connection state is stored in the `state` table of Airbyte's internal configuration database. This table contains a `state` JSONB column that holds the state information for each connection, along with a `connection_id` column that links it to the corresponding connection. Other relevant columns include `id`, `created_at`, `updated_at`, `stream_name`, `namespace`, and `type` (which can be `STREAM`, `GLOBAL`, or `LEGACY`) [Airbyte Databases Data Catalog - state](https://docs.airbyte.com/platform/understanding-airbyte/database-data-catalog#state).

You can query the state for a specific connection using:

```sql
SELECT *
FROM state
WHERE connection_id = '<your-connection-id>';
```

This will return the state records associated with that connection.
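To drill into those columns a bit further, here is a hedged variant of that query that narrows the result to per-stream state records, using only the columns listed above. It assumes the connection uses `STREAM`-type state; the contents of the `state` JSONB payload vary by connector.

```sql
-- Sketch: per-stream state records for one connection, newest first.
-- Column names follow the data catalog linked above; replace the
-- placeholder connection_id with your own.
SELECT stream_name,
       namespace,
       type,
       state,       -- JSONB payload; structure depends on the connector
       updated_at
FROM state
WHERE connection_id = '<your-connection-id>'
  AND type = 'STREAM'
ORDER BY updated_at DESC;
```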
kapa.ai
11/02/2025, 3:08 AM
Historical state information can be inspected through the same `state` table. The table includes `created_at` and `updated_at` timestamp columns, which indicate when each state record was created and last modified. The presence of these timestamps and the structure of the table suggest that Airbyte keeps a history of state changes rather than only the most recent state, so you may be able to see previous state records for a connection, not just the latest one [Airbyte Databases Data Catalog - state](https://docs.airbyte.com/platform/understanding-airbyte/database-data-catalog#state).

You can query the table to view historical states for a connection, for example:

```sql
SELECT *
FROM state
WHERE connection_id = '<your-connection-id>'
ORDER BY created_at DESC;
```

This will show you the sequence of state records over time for that connection.
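If you only want the most recent record per stream rather than the full sequence, a `DISTINCT ON` query is one option. This is a sketch built on the same columns as above; it assumes the configuration database is Postgres and that the table can hold more than one row per `(connection_id, stream_name)` pair.

```sql
-- Sketch: latest state row per stream for a single connection.
SELECT DISTINCT ON (stream_name)
       stream_name,
       namespace,
       state,
       updated_at
FROM state
WHERE connection_id = '<your-connection-id>'
ORDER BY stream_name, updated_at DESC;
```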
kapa.ai
11/02/2025, 3:41 AM
Sync logs are available in a few places:

- Docker volume: In a Docker deployment, job logs are written to the `airbyte_workspace` volume. You can browse these logs by mounting the volume in a container and navigating to the relevant job attempt directory (e.g., `/data/9/2/` inside the container corresponds to `/tmp/workspace/9/2/` on the host); see the sketch after this list [Exploring Local Logs](https://docs.airbyte.com/platform/operator-guides/browsing-output-logs#exploring-local-logs).
- Cloud storage (S3, GCS, Minio): You can configure Airbyte to store logs in an S3 bucket, GCS bucket, or Minio by setting the appropriate environment variables (`S3_BUCKET`, `GCS_LOG_BUCKET`, etc.) in your deployment [Configuring Airbyte - Logging](https://docs.airbyte.com/platform/operator-guides/configuring-airbyte#logging).
- In the UI: Logs for each sync job are accessible via the Airbyte UI under the connection's "Job History" tab. You can view, download, or copy links to these logs directly from the UI [Browsing logs](https://docs.airbyte.com/platform/operator-guides/browsing-output-logs).

If you need to access logs outside the UI, you can browse the Docker volume or the configured cloud storage location.
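For the Docker-volume route, the browsing-output-logs guide's approach is to mount `airbyte_workspace` into a throwaway container. The commands below are a sketch along those lines; the `9/2` job/attempt path is just the example IDs from above, and the log filename may differ, so list the directory first and adjust.

```bash
# Mount the airbyte_workspace volume into a temporary busybox container
# and list the files for job 9, attempt 2 (example IDs from above).
docker run -it --rm --volume airbyte_workspace:/data busybox ls /data/9/2

# Print a file from that attempt; adjust the filename to whatever `ls` showed.
docker run -it --rm --volume airbyte_workspace:/data busybox cat /data/9/2/logs.log
```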