05/07/2025, 1:07 PM
Timmy Walters
05/07/2025, 1:22 PM
We found that our actor and connection records had been flagged as deleted (tombstone = true for actor, status = 'deprecated' for connection) — even though we haven't done an upgrade recently.
This caused the UI/API to show no workspaces, connections, or sources, and threw errors like:
TypeError: Cannot read properties of undefined (reading 'workspaceId')
and
Failed to get organization id for source id: ...
Any idea why this might’ve happened? Was there a script or cleanup job that may have flagged everything as deleted? Just trying to understand so we can prevent it from happening again.
Thanks!
Sirine Hdiji
05/07/2025, 1:31 PM
I'm using a DatetimeStreamSlicer to generate monthly date slices between a start_date and end_date defined in the config: 202501, 202502...
My goal is to pass the date from each slice (in the format YYYYMM) as a required query parameter in the API request. Here's a simplified version of my YAML configuration:
stream_slicer:
  type: DatetimeStreamSlicer
  cursor_field: date
  start_datetime: "{{ config['start_date'] }}"
  end_datetime: "{{ config['end_date'] }}"
  step: "1M"
  datetime_format: "%Y%m"
  cursor_granularity: "P1M"
retriever:
  type: SimpleRetriever
  requester:
    request_parameters:
      date: "{{ stream_slice['date'] }}"
However, in the actual requests the date parameter is missing, and the API returns a 400 error: "must have required property 'date'".
Is DatetimeStreamSlicer still supported, or has it been deprecated in favor of another approach?
Also, what are the best practices for passing dynamic query parameters? Any tips or examples would be appreciated!
Thanks a lot 🙏
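(For reference, a rough sketch of one way the slice value could be injected as a query parameter, assuming a recent low-code CDK where DatetimeBasedCursor replaced DatetimeStreamSlicer; the component and field names below follow that assumption and are not verified against this particular connector:)
incremental_sync:
  type: DatetimeBasedCursor
  cursor_field: date
  datetime_format: "%Y%m"
  cursor_granularity: "P1M"
  step: "P1M"
  start_datetime: "{{ config['start_date'] }}"
  end_datetime: "{{ config['end_date'] }}"
  # inject the slice start (YYYYMM) directly as the 'date' query parameter
  start_time_option:
    type: RequestOption
    inject_into: request_parameter
    field_name: date
With this shape the slice boundaries should also be exposed as {{ stream_slice['start_time'] }} / {{ stream_slice['end_time'] }}, which can be referenced from request_parameters if option-based injection doesn't fit.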
Théo
05/07/2025, 1:44 PM
Hari Haran R
05/07/2025, 1:54 PM
Hari Haran R
05/07/2025, 2:01 PM
Sree Shanthan Kuthuru
05/07/2025, 2:20 PM
message='Airbyte could not start the process within time limit. The workload was claimed but never started.', type='io.airbyte.workers.exception.WorkloadMonitorException', nonRetryable=false
Charles Bockelmann
05/07/2025, 2:29 PM
Moe Hein Aung
05/07/2025, 2:47 PM
Workload failed, source: workload-monitor-heartbeat Airbyte could not track the sync progress. Sync process exited without reporting status
I did some research online and this seems to be because the storage for MinIO logs ran out of space. As suggested in this GitHub issue, I set about deleting and re-creating the MinIO pod and the PVs attached to it as a fix.
I started by running the following:
kubectl get statefulset -n airbyte
kubectl get pvc -n airbyte
kubectl get pv -n airbyte
Then deleted them all successfully:
kubectl delete statefulset airbyte-minio -n airbyte
kubectl delete pvc airbyte-minio-pv-claim-airbyte-minio-0 -n airbyte
kubectl delete pv pvc-26a02143-f688-4674-a49f-1335e8c74cca
In the process I also ran helm repo update to move from 1.5.1 to 1.6.1, then tried helm upgrade:
helm upgrade --install airbyte airbyte/airbyte --version 1.6.1 -n airbyte -f values.yaml --debug
However, it gets stuck with Pod airbyte-minio-create-bucket running and the minio pod does not get created. This is the relevant part of my Helm values:
# for v1.6.1
logs:
  storage:
    type: minio
minio:
  enabled: true
  persistence:
    enabled: true
    size: 20Gi
    storageClass: gp2
# old for v1.5.1 commented out
# minio:
#   persistence:
#     enabled: true
#     size: 20Gi
#     storageClass: "gp2"
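(A guess at how the same intent could be expressed with the global.storage block that newer chart versions document; these key names are from memory and should be checked against the 1.6.1 values reference before relying on them:)
global:
  storage:
    type: minio        # keep using the bundled MinIO for logs/state
minio:
  persistence:
    enabled: true
    size: 20Gi
    storageClass: gp2
If the create-bucket job stays stuck, the logs of the airbyte-minio-create-bucket pod are usually the quickest way to see whether it can reach the MinIO service at all.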
Chris
05/07/2025, 2:48 PM
Gary James
05/07/2025, 3:07 PM
Chris Brunton
05/07/2025, 3:22 PM
Where should I configure the MAX_SYNC_WORKERS setting if I am using the Open Source version of Airbyte, version 1.6, on an AWS EC2 instance?
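(One possible route, if the instance was installed with abctl/Helm, is the same global.env_vars pattern that comes up later in this thread; the value below is purely illustrative, and whether this variable reaches the worker in 1.6 is an assumption worth verifying:)
global:
  env_vars:
    MAX_SYNC_WORKERS: "10"   # illustrative value, not a recommendation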
Nicolas Scholten
05/07/2025, 4:03 PM
TypeError: Cannot read properties of undefined (reading 'email')
Caleb NXTLI
05/07/2025, 4:45 PM
Omree Gal-Oz
05/07/2025, 5:58 PM
Chris
05/07/2025, 9:26 PM
Michael Johnsey
05/07/2025, 10:13 PM
I'm getting ValueError: No format in ['%ms', '%ms'] matching, but it's not clear to me which stream is causing the issue. Is there any way to figure that out?
Brandon Rickman
05/08/2025, 12:29 AM
Sean Stach
05/08/2025, 2:15 AM
Lakshmipathy
05/08/2025, 5:00 AM
StreamReadConstraints.getMaxStringLength()) (through reference chain: io.airbyte.connectorbuilderserver.api.client.model.generated.StreamRead["slices"]->java.util.ArrayList[0]->io.airbyte.connectorbuilderserver.api.client.model.generated.StreamReadSlicesInner["pages"]->java.util.ArrayList[0]->io.airbyte.connectorbuilderserver.api.client.model.generated.StreamReadSlicesInnerPagesInner["response"]->io.airbyte.connectorbuilderserver.api.client.model.generated.HttpResponse["body"])
Can you please suggest how to fix this issue?
Lakshmipathy
05/08/2025, 5:19 AM
How do I set MICRONAUT_SERVER_MAX_REQUEST_SIZE? Can you give the steps?
Julian Andersen
05/08/2025, 7:25 AM
Vasily Safronov
05/08/2025, 8:02 AM
Syed Hamza Raza Kazmi
05/08/2025, 8:27 AM
Fasih Ullah
05/08/2025, 8:35 AM
Namratha D
05/08/2025, 9:27 AMglobal:
env_vars:
MICRONAUT_SERVER_MAX_REQUEST_SIZE: "104857600" # For HTTP request size
JACKSON_DATABIND_MAX_STRING_LENGTH: "30000000" # Increase Jackson string length limit
and I have reinstalled Airbyte.
Are there any other approaches that could solve this issue?
Lakshmipathy
05/08/2025, 10:44 AM
kanzari soumaya
05/08/2025, 10:49 AM
Induprakash Gowreesan
05/08/2025, 11:12 AM
hiwot tadese
05/08/2025, 11:42 AM