Kenneth Fernandez
06/19/2025, 7:58 PM

Justin Frye
06/22/2025, 9:26 PM
Even after running abctl local install --insecure-cookies, I still get the error:
your credentials were correct, but the server failed to set a cookie. You appear to have deployed over HTTP. Make sure you have disabled secure cookies.
This is running on a droplet in DigitalOcean, on Ubuntu 24.04 LTS.
Installed via the same steps as above
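If the droplet is reached by its public IP, the cookie behavior can also depend on the host abctl was installed with; a hedged sketch of a reinstall that binds the public address (the --host and --insecure-cookies flags exist in abctl; the IP below is a placeholder):

```shell
# Re-run the install, binding the droplet's public address so the cookie
# settings match how the browser reaches the instance (IP is a placeholder).
abctl local install --host 203.0.113.10 --insecure-cookies
# Fetch the generated credentials to retry the login.
abctl local credentials
```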
I am currently trying the --no-browser method, but I would ideally like to still use a browser to set up the connectors. This is simply a PoC, so I am not really looking to set up a cert for this. However, if that is going to resolve my issue, I am all ears and would be extremely grateful if someone had steps to do this.

Giacomo Chiarella
06/23/2025, 8:12 AM

Pranay Gangaram Deokar
06/23/2025, 9:42 AM

Bernardo Fernandes
06/23/2025, 11:51 AM
unknown status code returned: Status 500
│ {"status":500,"type":"https://reference.airbyte.com/reference/errors","title":"unexpected-problem","detail":"An
│ unexpected problem has occurred. If this is an error that needs to be
│ addressed, please submit a pull request or github
│ issue.","documentationUrl":null,"data":{"message":"java.util.concurrent.ExecutionException:
│ io.airbyte.data.exceptions.ConfigNotFoundException: config type:
│ STANDARD_WORKSPACE id: cc585b57-9873-44db-823e-4bc3ed02180f"}}
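The 500 above means the server cannot find a STANDARD_WORKSPACE with that id, which often points at a wiped or switched database rather than an API problem. One way to see which workspace ids the instance actually has (endpoint, port, and credentials are assumptions to adapt):

```shell
# List workspaces known to this Airbyte instance and compare the ids
# against the one in the error (cc585b57-...).
curl -s -X POST http://localhost:8000/api/v1/workspaces/list \
  -H "Content-Type: application/json" -d '{}' -u airbyte:password
```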
Aneela Saleem
06/23/2025, 2:20 PM

Pedro Roque
06/23/2025, 9:45 PM
global:
  jobs:
    resources:
      requests:
        memory: 16Gi
      limits:
        memory: 16Gi
and ran abctl local install --values .airbyte/values.yaml
but it looks like it didn't work
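One way to check whether an override like this actually reached the job pods is to inspect a running replication pod (abctl installs into the airbyte-abctl namespace by default; the pod name is illustrative):

```shell
# Show the memory requests/limits applied to each container of a
# replication pod spawned for a sync.
kubectl -n airbyte-abctl get pod <replication-pod-name> \
  -o jsonpath='{.spec.containers[*].resources}'
```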
is this the correct way to increase memory?

Aaron Robins
06/24/2025, 3:32 AM

Louis Gabilly
06/24/2025, 4:05 PM
curl -X POST "http://airbyte-prod-airbyte-server-svc.airbyte.svc.cluster.local:8001/api/v1/connections/sync" -H "Accept: application/json" -H "Content-Type: application/json" -d '{"connectionId":"b971a860-254b-4e0a-b4ff-db3e4761f3be"}'
When I try to run the following command to get the job status:
curl --request GET \
--url http://airbyte-prod-airbyte-server-svc.airbyte.svc.cluster.local:8001/v1/jobs/1518 \
--header 'accept: application/json'
I get a weird HTML saying "You need to enable JavaScript to run this app."
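That HTML is the Airbyte webapp answering, which suggests the path missed the API routes; in recent Airbyte versions the public job-status endpoint is usually served under /api/public/v1 (worth verifying against your version's API reference). A sketch of building the corrected URL, with the host and job id taken from the commands above; the /api/public/v1 prefix is an assumption:

```python
# Hypothetical sketch: construct the public-API job-status URL.
HOST = "airbyte-prod-airbyte-server-svc.airbyte.svc.cluster.local:8001"

def job_status_url(job_id: int) -> str:
    # The bare /v1/jobs/{id} path falls through to the webapp (hence the
    # "enable JavaScript" HTML); the API is expected under /api/public/v1.
    return f"http://{HOST}/api/public/v1/jobs/{job_id}"

print(job_status_url(1518))
```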
I have tried many settings (I share the last one of them) to set up an Airbyte connection within the Airflow interface, but both the Airbyte and HTTP connection types keep failing.
airflow.exceptions.AirflowException: HTTPConnectionPool(host='airbyte-prod-airbyte-server-svc.airbyte.svc.cluster.local', port=80): Max retries exceeded with url: /v1/applications/token (Caused by ConnectTimeoutError(<urllib3.connection.HTTPConnection object at 0x7ff5616a7590>, 'Connection to airbyte-prod-airbyte-server-svc.airbyte.svc.cluster.local timed out. (connect timeout=None)'))
Can you guys help me see what I am missing?

Alec Sharp
06/25/2025, 8:29 AM

Idan Moradov
06/25/2025, 10:15 AM

Mert Ors
06/25/2025, 10:16 AM

Oleksandr Riabyi
06/25/2025, 10:42 AM
./gradlew :airbyte-integrations:connectors:destination-mysql:build -x test -x integrationTestJava
3. This created the image airbyte/destination-mysql:dev. I then tagged and pushed it to my public Docker Hub account:
docker tag df5787833601 olria97/testmysql_3:latest
docker push olria97/testmysql_3:latest
4. However, when I try to add this connector as a destination in the Airbyte UI, I receive the following error:
An unexpected error occurred. Please report this if the issue persists. (HTTP 500)
I suspect it might be related to the image architecture or compatibility. Here’s some info about my setup:
• Python: 3.11.9
• Java: OpenJDK 21 (JAVA_HOME=/opt/homebrew/opt/openjdk@21)
• Architecture: arm64
• CPU: Apple M3
I also tried passing platform-specific flags during the build:
./gradlew ... -Dos.arch=amd64 -Dos.name=Linux
…but that didn’t seem to help.
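The -Dos.arch / -Dos.name properties only affect the JVM, not the Docker platform, so on an M3 the build will still produce an arm64 image. A hedged sketch for checking the pushed image and, if needed, cross-building it (the tag is from the steps above; the Dockerfile path is an assumption, since newer connectors may be built differently):

```shell
# Check the architecture of the pushed image.
docker inspect olria97/testmysql_3:latest \
  --format '{{.Os}}/{{.Architecture}}'

# If it reports linux/arm64 but the Airbyte nodes are amd64, cross-build
# and push (assumes a Dockerfile exists in the connector directory).
docker buildx build --platform linux/amd64 \
  -t olria97/testmysql_3:amd64 --push \
  airbyte-integrations/connectors/destination-mysql
```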
Can anyone help me understand what might be going wrong? Should I be building the image differently to support Airbyte on this setup?
Thanks in advance!

Mert Karabulut
06/25/2025, 12:00 PM

Jacob Batt
06/25/2025, 2:35 PM

Mohd Asad
06/25/2025, 7:58 PM
helm repo add airbyte https://airbytehq.github.io/helm-charts
helm install my-airbyte airbyte/airbyte --version 1.7.0
The core components are running fine. However, when I create a source and destination and trigger a sync, a new replication job pod is created. This pod includes three containers—`source`, `destination`, and `orchestrator`—and it requests a total of 4 CPUs, which is too high for my environment.
I attempted to reduce the CPU and memory usage by setting the following values in my `values.yaml`:
global:
  jobs:
    resources:
      requests:
        cpu: 250m
        memory: 256Mi
      limits:
        cpu: 500m
        memory: 512Mi
I also tried setting these environment variables:
JOB_MAIN_CONTAINER_CPU_REQUEST
JOB_MAIN_CONTAINER_CPU_LIMIT
JOB_MAIN_CONTAINER_MEMORY_REQUEST
JOB_MAIN_CONTAINER_MEMORY_LIMIT
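Since the pod runs three containers, the 4-CPU total may come from per-connector resource requirements that override the global job settings; inspecting a live replication pod shows which container carries which request (namespace and pod name are illustrative):

```shell
# Print each container's name and resource block for a replication pod.
kubectl -n airbyte get pod <replication-pod-name> \
  -o jsonpath='{range .spec.containers[*]}{.name}: {.resources}{"\n"}{end}'
```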
Despite these changes, the replication job pods are still requesting 4 CPUs. I'm looking for a reliable way to reduce their resource requests to around 1.5 to 2 CPUs in total.

Jordan Lee
06/26/2025, 4:42 AM

Phat Luu Huynh
06/26/2025, 7:00 AM
DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): airbyte.staging.data-engineering.myteksi.net:443
send: b'GET /api/v1/health HTTP/1.1\r\nHost: <valid-host>\r\nuser-agent: speakeasy-sdk/python 0.52.2 2.474.15 1.0.0 airbyte-api\r\nAccept-Encoding: gzip, deflate\r\nAccept: */*\r\nConnection: keep-alive\r\n\r\n'
reply: 'HTTP/1.1 401 Unauthorized\r\n'
header: Content-Type: text/html
header: Date: Thu, 26 Jun 2025 05:59:26 GMT
header: WWW-Authenticate: Basic realm=""
header: Content-Length: 172
header: Connection: keep-alive
DEBUG:urllib3.connectionpool:<valid-host> "GET /api/v1/health HTTP/1.1" 401 172
Error: API error occurred: Status 401
<html>
<head><title>401 Authorization Required</title></head>
<body>
here's my code:
import airbyte_api
from airbyte_api import errors, models

client = airbyte_api.AirbyteAPI(
    server_url=Config.AIRBYTE_SERVER_URL,
    security=models.Security(
        basic_auth=models.SchemeBasicAuth(
            username=Config.AIRBYTE_USERNAME,
            password=Config.AIRBYTE_PASSWORD
        )
    )
)
try:
    health = client.health.get_health_check()
    print(health)
except errors.SDKError as e:
    print(f"Error: {e}")
Assume that all my credentials and configs are valid (I've already tested them in Postman and they responded successfully).
I asked Cursor to diagnose the error, and it said there's a problem with the SDK's Basic Authentication.
Did I miss any important steps? Could you kindly advise? Thanks so much!
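The 401 page in the log looks like an nginx-style "401 Authorization Required" with WWW-Authenticate: Basic realm="", so the rejection may come from a reverse proxy in front of Airbyte rather than from the SDK itself. One way to isolate it is to build the Basic header by hand and compare it byte-for-byte with what Postman sends; a sketch with placeholder credentials:

```python
import base64

# Build the HTTP Basic Authorization header value manually (placeholder
# credentials) and compare it with the header Postman shows in its console.
username, password = "airbyte", "password"
token = base64.b64encode(f"{username}:{password}".encode()).decode()
auth_header = f"Basic {token}"
print(auth_header)
```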
Version:
• python: 3.10.13
• airbyte-api: 0.52.2

Durim Gashi
06/26/2025, 9:18 AM
2025-06-26 11:13:54 replication-orchestrator INFO Failures: [ {
"failureOrigin" : "source",
"internalMessage" : "Source process exited with non-zero exit code 1",
"externalMessage" : "Something went wrong within the source connector",
"metadata" : {
"attemptNumber" : 4,
"jobId" : 40576074,
"connector_command" : "read"
},
"stacktrace" : "io.airbyte.workers.internal.exception.SourceException: Source process exited with non-zero exit code 1\n\tat io.airbyte.container.orchestrator.worker.SourceReader.run(ReplicationTask.kt:209)\n\tat io.airbyte.container.orchestrator.worker.SourceReader$run$1.invokeSuspend(ReplicationTask.kt)\n\tat kotlin.coroutines.jvm.internal.BaseContinuationImpl.resumeWith(ContinuationImpl.kt:33)\n\tat kotlinx.coroutines.DispatchedTask.run(DispatchedTask.kt:100)\n\tat io.micrometer.core.instrument.composite.CompositeTimer.record(CompositeTimer.java:141)\n\tat io.micrometer.core.instrument.Timer.lambda$wrap$2(Timer.java:199)\n\tat datadog.trace.bootstrap.instrumentation.java.concurrent.Wrapper.run(Wrapper.java:47)\n\tat java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144)\n\tat java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642)\n\tat java.base/java.lang.Thread.run(Thread.java:1583)\n",
"timestamp" : 1750929234687
} ]
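The failure above only says the source exited non-zero; the actual cause is usually in the source container's own log rather than the orchestrator's. A sketch for pulling it from the job pod (namespace and pod name are illustrative):

```shell
# Find the most recent replication job pods, then read the source
# container's log for the real error.
kubectl -n airbyte get pods --sort-by=.metadata.creationTimestamp | tail
kubectl -n airbyte logs <replication-job-pod> -c source | tail -n 100
```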
Arun Daniel
06/26/2025, 10:19 AM

Xavier Van Ausloos
06/26/2025, 1:29 PM
http://localhost:8081/api/v1/workspaces/list
Got 404 not found error
API works well for getting all connections (with basic auth): http://localhost:8081/api/v1/connections/list
I am using Airbyte 1.7.1, deployed via Helm.
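The config-API list endpoints are POST-only, so a plain GET on /api/v1/workspaces/list typically falls through to a 404. A sketch of both options (credentials are placeholders, and the /api/public/v1 path is an assumption to check against the 1.7 API reference):

```shell
# Config API: list endpoints expect POST with a JSON body.
curl -s -X POST http://localhost:8081/api/v1/workspaces/list \
  -H "Content-Type: application/json" -d '{}' -u airbyte:password

# If that endpoint is gone in 1.7, try the public API equivalent.
curl -s http://localhost:8081/api/public/v1/workspaces -u airbyte:password
```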
@kapa.ai any idea?

Stockton Fisher
06/26/2025, 3:00 PM

Tom Holder
06/26/2025, 5:04 PM

Prashanth Mohan
06/26/2025, 6:19 PM
I understood that setting CONTAINER_ORCHESTRATOR_ENABLED to false would allow environment variables with the JOB_DEFAULT_ENV prefix to be passed on to spawned jobs, but this strategy did not work. Is there another way that was introduced to accomplish this?
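One chart-level mechanism that may cover this (key name per recent Airbyte Helm charts; worth verifying against your chart version, including whether job pods actually inherit it) is global.env_vars, whose entries are rendered into the shared env ConfigMap:

```yaml
global:
  env_vars:
    MY_CUSTOM_VAR: "some-value"
```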
(If it comes down to it, maybe we can build custom Docker images with the environment variables baked in, but it would be nice to have a way to do this via the Helm chart directly.)

Kanchal Karale
06/27/2025, 10:14 AM

Usman Pasha
06/27/2025, 10:22 AM

Justin Beasley
06/27/2025, 2:11 PM

Hảo Phan
06/27/2025, 3:50 PM

Hảo Phan
06/27/2025, 3:50 PM