# ask-ai
  • Josh

    05/12/2023, 6:15 PM
    Just using the standard clone and bash script I get this in the log:
    Copy code
    | 172.29.0.9 - airbyte [12/May/2023:18:14:30 +0000] "POST /api/v1/workspaces/list HTTP/1.0" 502 497 "<http://localhost:8000/>" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/111.0.0.0 Safari/537.36" "172.29.0.1"
    airbyte-webapp                    | 2023/05/12 18:14:30 [error] 26#26: *4 connect() failed (111: Connection refused) while connecting to upstream, client: 172.29.0.9, server: localhost, request: "POST /api/v1/workspaces/list HTTP/1.0", upstream: "<http://172.29.0.7:8001/api/v1/workspaces/list>", host: "localhost", referrer: "<http://localhost:8000/>"
    airbyte-proxy                     | 172.29.0.1 - airbyte [12/May/2023:18:14:30 +0000] "POST /api/v1/workspaces/list HTTP/1.1" 502 497 "<http://localhost:8000/>" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/111.0.0.0 Safari/537.36"
  • Alan Szpigiel

    05/15/2023, 4:56 AM
    Hi! I just deployed everything on AWS EC2 but I’m having problems with the docker container
    airbyte-temporal
    . It’s not pointing to the right DB_HOST. I customized the DB:
    Copy code
    ### DATABASE ###
    # Airbyte Internal Job Database, see <https://docs.airbyte.io/operator-guides/configuring-airbyte-db>
    DATABASE_USER=airbyte_rw
    DATABASE_PASSWORD=*****
    DATABASE_HOST=prd-datawarehouse.int.main.url
    DATABASE_PORT=5432
    DATABASE_DB=airbyte
    # translate manually DATABASE_URL=jdbc:postgresql://${DATABASE_HOST}:${DATABASE_PORT}/${DATABASE_DB} (do not include the username or password here)
    DATABASE_URL=jdbc:postgresql://prd-datawarehouse.int.main.url:5432/airbyte
    JOBS_DATABASE_MINIMUM_FLYWAY_MIGRATION_VERSION=0.40.26.001
    And the docker-compose.yml looks like this:
    Copy code
    airbyte-temporal:
        image: airbyte/temporal:${VERSION}
        logging: *default-logging
        container_name: airbyte-temporal
        restart: unless-stopped
        environment:
          - DB=postgresql
          - DB_PORT=${DATABASE_PORT}
          - DYNAMIC_CONFIG_FILE_PATH=config/dynamicconfig/development.yaml
          - LOG_LEVEL=${LOG_LEVEL}
          - POSTGRES_PWD=${DATABASE_PASSWORD}
          - POSTGRES_SEEDS=${DATABASE_HOST}
          - POSTGRES_USER=${DATABASE_USER}
        volumes:
          - ./temporal/dynamicconfig:/etc/temporal/config/dynamicconfig
        networks:
          - airbyte_internal
    but the temporal server keeps restarting with the logs:
    Copy code
    2023-05-15T04:41:18.615Z	ERROR	Unable to create SQL database.	{"error": "unable to connect to DB, tried default DB names: postgres,defaultdb, errors: [pq: no pg_hba.conf entry for host \"10.0.101.173\", user \"airbyte_rw\", database \"postgres\", no encryption pq: no pg_hba.conf entry for host \"10.0.101.173\", user \"airbyte_rw\", database \"defaultdb\", no encryption]", "logging-call-at": "handler.go:97"}
    Important! host "10.0.101.173" -> this is the local IP, not
    prd-datawarehouse.int.main.url
    and the database must be
    airbyte
    Has anyone had this problem?
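Editor's note, hedged: in this Postgres error the quoted host is the *client* address the server saw, not the host being dialed, so airbyte-temporal may well be reaching the right server and simply being rejected by its pg_hba.conf. Temporal also probes default database names (postgres, defaultdb) during its own bootstrap rather than using DATABASE_DB, so those names in the log are expected. A sketch of a pg_hba.conf entry that would admit this client (the subnet, user, and auth method here are assumptions to adjust to your setup):

```
# pg_hba.conf sketch (assumption: airbyte_rw connecting from the 10.0.0.0/16
# subnet; pick the ADDRESS and METHOD matching your network and auth config)
# TYPE   DATABASE   USER         ADDRESS        METHOD
host     all        airbyte_rw   10.0.0.0/16    scram-sha-256
```

The "no encryption" part of the message suggests the client connected without SSL; if your server only has `hostssl` rules, that alone can cause the rejection.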
  • King Ho

    05/15/2023, 11:05 AM
    We seem to be running into issues where our Airbyte instances, deployed with Docker on VMs hosted on GCP, become partially unresponsive. The web interface runs, but clicking on a job in a connection leaves the progress wheel spinning endlessly and results in a 502 error on the request (get_debug_info). After restarting the Docker container it's all OK, but this is not sustainable. It almost feels like there is a caching error or problem.
  • MEJDOUBI Mohamed

    05/15/2023, 4:59 PM
    Hi all, first try with Airbyte. I followed the basic tutorial steps using Docker Compose, and when running docker compose up -d I got this error message: ERROR: Invalid interpolation format for "environment" option in service "worker": "CONFIG_DATABASE_PASSWORD=${CONFIG_DATABASE_PASSWORD:-}"
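Editor's note, hedged: `${VAR:-}` is valid Compose interpolation, but this error is often reported with older docker-compose 1.x builds (e.g. distro packages); upgrading to Compose V2 (`docker compose`) is the usual fix. The syntax itself mirrors POSIX shell expansion, which a plain shell demonstrates:

```shell
# The ${VAR:-default} form Compose uses mirrors POSIX shell expansion:
unset CONFIG_DATABASE_PASSWORD
echo "password is '${CONFIG_DATABASE_PASSWORD:-}'"          # empty default, no error
echo "password is '${CONFIG_DATABASE_PASSWORD:-fallback}'"  # non-empty default
```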
  • Kevin Ruprecht

    05/15/2023, 8:35 PM
    Trying to install Airbyte in GCP for testing out a new ELT pipeline. I've installed Airbyte in GCP before for proof-of-concept purposes and followed the directions on Airbyte's website for deploying on GCP VMs just fine, but this time around I'm running into this error when running
    docker compose up -d
    .
    parsing /home/xxxxxxxxxx/airbyte/docker-compose.yaml: Non-string key at top level: 404
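Editor's note, hedged: "Non-string key at top level: 404" suggests the YAML parser read something like an HTTP 404 page instead of a compose file, which happens when the download of docker-compose.yaml failed. A quick sanity-check sketch:

```shell
# Sanity check: real compose files start with YAML keys, not HTML or "404".
FILE=docker-compose.yaml
if [ -f "$FILE" ]; then
  head -n 3 "$FILE"
  if grep -qi '<html' "$FILE"; then
    echo "looks like an HTML error page - re-download $FILE"
  fi
fi
```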
  • Fulvio Mascara

    05/16/2023, 4:54 AM
    Hello All ... I have Airbyte running on GCP K8s (GKE) with no issues, and I'm considering using Airflow to orchestrate ingestion in Airbyte. Now I have to connect to an AWS instance through a VPN connection to ingest data from some of my transactional DBs. Question: what is the best approach to create this connection, given that I only need the VPN during ingestion?
  • Peter Cooper

    05/16/2023, 5:01 AM
    hello - I’m running into an error when trying to install from scratch after following the directions for local deployment on https://docs.airbyte.com/deploying-airbyte/local-deployment/
    Error response from daemon: invalid mount config for type "bind": bind source path does not exist: /airbyte/airbyte/flags.yml
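Editor's note, hedged: Docker refuses a bind mount whose source path does not exist on the host. Re-running the official setup script to fetch the real flags.yml is the cleaner fix; as a workaround, creating an empty placeholder lets the stack start (assumption: the defaults the file would set are acceptable):

```shell
# Workaround sketch: create the missing bind-mount source before `docker compose up`.
AIRBYTE_DIR="${AIRBYTE_DIR:-$HOME/airbyte}"   # the error above shows /airbyte/airbyte
mkdir -p "$AIRBYTE_DIR"
touch "$AIRBYTE_DIR/flags.yml"
```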
  • Joish Bosco

    05/16/2023, 7:29 AM
    Hello, I'm running into an error when trying to deploy Airbyte with Helm following https://docs.airbyte.com/deploying-airbyte/on-kubernetes-via-helm. Can someone please help?
    kubectl_logs_airbyte-airbyte-bootloader.txt
  • Rishikesh Srinivas

    05/16/2023, 8:51 AM
    Hi Team, we are going to production with Airbyte and facing a problem in audit since airbyte-db is not encrypted. I'm using an external Postgres DB, deployed with Kube and Helm. Please help me on this: is there any setting in Airbyte where encryption can be enabled? Any workaround is appreciated. Thanks, Rishikesh
  • Yusuf Mirkar

    05/17/2023, 9:58 AM
    Hello everyone, if I use incremental deduped+history mode for Postgres-to-Postgres replication and my source DB is currently 100 GB, will I have to maintain a destination volume of 200 GB? Reason: I read that in this mode an intermediate table maintained on the destination holds all the updated records, so each table will have one intermediate and one final table of almost the same size. Is this true?
  • Yusuf Mirkar

    05/17/2023, 10:22 AM
    Hello everyone, this is a continuation of the previous question; that thread reached the max number of replies, so I'm asking outside it to start a new one. Destination space is doubled or more when using incremental deduped+history mode. Why does it need the _scd table? Using a replication key (timestamp), why can't it pick just the updated rows and upsert/insert them instead of maintaining the _scd table?
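Editor's note on the mechanics, hedged: deduped+history is Slowly Changing Dimension type 2. The _scd table is what makes the mode "history": it keeps one row per *version* of each record, and the deduped final table is derived from it by keeping only the active version, which is why destination storage roughly doubles. An illustrative sketch (column names approximate what normalization emits):

```sql
-- _scd table: updates append new versions instead of overwriting
SELECT id, updated_at, _airbyte_active_row
FROM   users_scd
WHERE  id = 42;            -- several rows: the record's full history

-- final table: only the latest version of each record survives
SELECT id, updated_at
FROM   users
WHERE  id = 42;            -- one row: the current version
```

A plain upsert on the replication key would give deduplication but discard the per-version history the mode promises.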
  • Ramkumar Vaidyanathan

    05/17/2023, 6:26 PM
    What level of permission is needed for the Postgres user? We are getting this error when trying to launch Airbyte with Helm against our own RDS instance:
    Copy code
    2023-05-17T18:24:14.367Z	ERROR	Unable to create SQL database.	{"error": "pq: permission denied to create database", "logging-call-at": "handler.go:97"}
    2023/05/17 18:24:14 Loading config; env=docker,zone=,configDir=config
    2023/05/17 18:24:14 Loading config files=[config/docker.yaml]
    {"level":"info","ts":"2023-05-17T18:24:14.680Z","msg":"Updated dynamic config","logging-call-at":"file_based_client.go:143"}
    {"level":"info","ts":"2023-05-17T18:24:14.680Z","msg":"Starting server for services","value":["history","matching","frontend","worker"],"logging-call-at":"server.go:123"}
    Unable to start server. Error: sql schema version compatibility check failed: unable to read DB schema version keyspace/database: temporal error: pq: relation "schema_version" does not exist
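Editor's note, hedged: on first boot airbyte-temporal tries to create its own databases, and `pq: permission denied to create database` suggests the RDS user lacks CREATEDB. A sketch of the two usual remedies, with `airbyte_user` as a placeholder role name and the database names assumed from a default Temporal install:

```sql
-- Option 1: let the role create databases so Temporal can bootstrap itself
ALTER ROLE airbyte_user CREATEDB;

-- Option 2: pre-create the databases and hand them to the role
CREATE DATABASE temporal OWNER airbyte_user;
CREATE DATABASE temporal_visibility OWNER airbyte_user;
```

The follow-on `relation "schema_version" does not exist` error is consistent with the bootstrap never having completed, so it typically resolves once database creation succeeds.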
  • Qamarudeen Muhammad

    05/18/2023, 7:15 AM
    I need help locating the dbt file Airbyte uses for Basic Normalization. For now, I want to ensure the dbt on Airbyte works for my use case. As it stands, if I turn Basic Normalization on, everything works except that I get errors on particular rows in a particular table; the errors occur because a value in a column is not a valid timestamp. What I want to achieve is to customize the dbt SQL for that particular model by adding CASE conditions. The challenge I face is that workspace/number/number keeps changing for each run of the connection in the Airbyte UI, so I need guidance on accessing the file dbt actually uses. Thanks. Airbyte version 0.44.3. Error log from Airbyte:
    Copy code
    Invalid timestamp: '+12699-01-07T18:00:26.673000Z'
      compiled SQL at ../build/run/airbyte_utils/models/generated/airbyte_incremental/scd/airbyte_bigquery/mytablename_scd.sql
    40 of 63 ERROR creating incremental model airbyte_bigquery.mytablename_scd....................................... [ERROR in 6.40s]
    Database Error in model tbl_purchase_order_scd (models/generated/airbyte_incremental/scd/airbyte_bigquery/mytablename_scd.sql)
  • Yusuf Mirkar

    05/18/2023, 11:00 AM
    What is the use of airbyte-proxy? If I am using nginx as a reverse proxy, I cannot bind nginx to listen on 8000 because Airbyte is already listening on 8000.
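Editor's note, hedged: airbyte-proxy is the bundled nginx container that fronts the webapp/API (and carries the optional basic auth), which is why it owns port 8000. If you run your own nginx as well, a common pattern is to bind it to a different port and forward to 8000. A hypothetical fragment (the server name is a placeholder):

```
# Hypothetical nginx server block: your own reverse proxy listens on 80
# and forwards to the airbyte-proxy already bound to 8000.
server {
    listen 80;
    server_name airbyte.example.com;

    location / {
        proxy_pass http://127.0.0.1:8000;
        proxy_set_header Host $host;
    }
}
```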
  • Shivam Bhardwaj

    05/18/2023, 11:49 AM
    @kapa.ai Getting this error in deployment on kubernetes using plural: The certificate request has failed to complete and will be retried Failed to wait for order resource airbyte-tls-7dsbf-3668326671 to become ready: order is in errored state
  • Yusuf Mirkar

    05/18/2023, 11:59 AM
    How much RAM is needed for a sync?
  • Yusuf Mirkar

    05/18/2023, 1:03 PM
    Is a webhook other than Slack possible?
  • Haim Beyhan

    05/18/2023, 1:32 PM
    I am new to Airbyte. I need to deploy it on AWS EKS, and I will probably need to customize the deployment, configure ingress, etc. Are there any suggestions on which values-file fields have to be changed? I understand I also need a persistent volume; can I use S3 for it? I don't see any documentation about customization.
  • Sunny Hashmi

    05/18/2023, 4:17 PM
    👋 Hi all, we have an exciting announcement to share! Next week's Daily Airbyte Office Hours will feature Deep Dive Sessions hosted by the one and only @[DEPRECATED] Marcos Marx octavia muscle During the deep-dive sessions, Marcos will explain how Airbyte works, delving into each component in every session and explaining their functions. If you’re curious or want to learn more about Airbyte, these sessions will be truly valuable to you. For the first week we’re diving into the
    airbyte-bootloader
    and the
    airbyte-db
    services. The presentation will be 20 min, and we'll dedicate the remaining 25 min to questions about the daily topic or general Q&A. Check out the schedule below 👇 Reminders and updates will be posted in #C045VK5AF54
    🔥 Deep Dive Sessions: airbyte-bootloader
    • Monday May 22 - 1pm PDT (zoom link)
    • Tuesday May 23 - 16:00 CEST / 10am EDT (zoom link)
    🔥 Deep Dive Sessions: airbyte-db + Airbyte Database Internals
    • Wednesday May 24 - 1pm PDT (zoom link)
    • Thursday May 25 - 16:00 CEST / 10am EDT (zoom link)
    🔥 Open Q&A
    • Friday May 26 - 1pm PDT (zoom link)
    Hope to see you there! octavia rocket
  • Yusuf Mirkar

    05/18/2023, 5:32 PM
    What value should I put in JOB_MAIN_CONTAINER_CPU_REQUEST if I want to give 1 vCPU out of the 2 vCPUs of my EC2 instance?
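Editor's note, hedged: in the docker-compose deploy these knobs live in .env and are passed through to the job containers; the values are generally expressed in CPU units as in Docker/Kubernetes resource syntax. A sketch for the 1-of-2-vCPU case:

```
# .env sketch (assumption: docker-compose deploy, whole-CPU units accepted)
JOB_MAIN_CONTAINER_CPU_REQUEST=1
JOB_MAIN_CONTAINER_CPU_LIMIT=1
```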
  • Yusuf Mirkar

    05/18/2023, 6:17 PM
    I don't want to save sync job logs; logs are filling up the space. If I set TEMPORAL_HISTORY_RETENTION_IN_DAYS to 0, will it stop saving logs?
  • Aman Kesharwani

    05/18/2023, 7:19 PM
    Hi Everyone, when deploying Airbyte on an EKS cluster where I want to use S3 for storage and logging, I am getting the error below:
    ERROR i.m.r.Micronaut(handleStartupException):338 - Error starting Micronaut server: null
    java.lang.IllegalArgumentException: null
    	at com.google.common.base.Preconditions.checkArgument(Preconditions.java:131) ~[guava-31.1-jre.jar:?]
    	at io.airbyte.config.storage.DefaultS3ClientFactory.validateBase(DefaultS3ClientFactory.java:36) ~[io.airbyte.airbyte-config-config-models-0.44.0.jar:?]
    	at io.airbyte.config.storage.DefaultS3ClientFactory.validate(DefaultS3ClientFactory.java:31) ~[io.airbyte.airbyte-config-config-models-0.44.0.jar:?]
    	at io.airbyte.config.storage.DefaultS3ClientFactory.<init>(DefaultS3ClientFactory.java:24) ~[io.airbyte.airbyte-config-config-models-0.44.0.jar:?]
    	at io.airbyte.config.helpers.CloudLogs.createCloudLogClient(CloudLogs.java:51) ~[io.airbyte.airbyte-config-config-models-0.44.0.jar:?]
    	at io.airbyte.config.helpers.LogClientSingleton.createCloudClientIfNull(LogClientSingleton.java:226) ~[io.airbyte.airbyte-config-config-models-0.44.0.jar:?]
    	at io.airbyte.config.helpers.LogClientSingleton.setWorkspaceMdc(LogClientSingleton.java:213) ~[io.airbyte.airbyte-config-config-models-0.44.0.jar:?]
    	at io.airbyte.server.LoggingEventListener.onApplicationEvent(LoggingEventListener.java:34) ~[io.airbyte-airbyte-server-0.44.0.jar:?]
    	at io.airbyte.server.LoggingEventListener.onApplicationEvent(LoggingEventListener.java:21) ~[io.airbyte-airbyte-server-0.44.0.jar:?]
    	at io.micronaut.context.event.ApplicationEventPublisherFactory.notifyEventListeners(ApplicationEventPublisherFactory.java:262) ~[micronaut-inject-3.8.8.jar:3.8.8]
    	at io.micronaut.context.event.ApplicationEventPublisherFactory.access$200(ApplicationEventPublisherFactory.java:60) ~[micronaut-inject-3.8.8.jar:3.8.8]
    	at io.micronaut.context.event.ApplicationEventPublisherFactory$2.publishEvent(ApplicationEventPublisherFactory.java:229) ~[micronaut-inject-3.8.8.jar:3.8.8]
    	at io.micronaut.http.server.netty.NettyHttpServer.lambda$fireStartupEvents$15(NettyHttpServer.java:587) ~[micronaut-http-server-netty-3.8.8.jar:3.8.8]
    	at java.util.Optional.ifPresent(Optional.java:178) ~[?:?]
    	at io.micronaut.http.server.netty.NettyHttpServer.fireStartupEvents(NettyHttpServer.java:581) ~[micronaut-http-server-netty-3.8.8.jar:3.8.8]
    	at io.micronaut.http.server.netty.NettyHttpServer.start(NettyHttpServer.java:298) ~[micronaut-http-server-netty-3.8.8.jar:3.8.8]
    	at io.micronaut.http.server.netty.NettyHttpServer.start(NettyHttpServer.java:104) ~[micronaut-http-server-netty-3.8.8.jar:3.8.8]
    	at io.micronaut.runtime.Micronaut.lambda$start$2(Micronaut.java:81) ~[micronaut-context-3.8.8.jar:3.8.8]
    	at java.util.Optional.ifPresent(Optional.java:178) ~[?:?]
    	at io.micronaut.runtime.Micronaut.start(Micronaut.java:79) ~[micronaut-context-3.8.8.jar:3.8.8]
    	at io.micronaut.runtime.Micronaut.run(Micronaut.java:323) ~[micronaut-context-3.8.8.jar:3.8.8]
    	at io.micronaut.runtime.Micronaut.run(Micronaut.java:309) ~[micronaut-context-3.8.8.jar:3.8.8]
    	at io.airbyte.server.Application.main(Application.java:15) ~[io.airbyte-airbyte-server-0.44.0.jar:?]
    I have created a service account with an attached IAM policy to access the desired S3 bucket and updated the values.yaml file accordingly. Attaching the values.yaml file for reference; any help will be really appreciated.
    values.yaml_shared
  • Saptaswa Pal

    05/19/2023, 5:00 AM
    Hello Team, I am trying to enable basic auth in the Airbyte application I have deployed in EKS using Kubernetes with Helm. I can't seem to find a way to do this in the application directly, and instead need to do it by changing the annotations of the ingress controller (by enabling nginx). Is there a way to enable basic auth directly in the application? Kind help. TIA!
  • vasu cheemakurthi

    05/19/2023, 10:17 AM
    Hello Team, I am new to Airbyte and trying to set it up on an AWS EC2 instance. After all the setup, I am not able to connect using the default Airbyte logins. Can someone please help in case I am missing any steps?
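Editor's note, hedged: the docker-compose deploy ships default basic-auth credentials in the repository's .env file; confirm against your own checkout, but they are typically:

```
# .env defaults in the docker-compose deploy (verify in your checkout)
BASIC_AUTH_USERNAME=airbyte
BASIC_AUTH_PASSWORD=password
```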
  • Octavia Squidington III

    05/19/2023, 7:45 PM
    🔥 Community Office Hours starts in 15 minutes 🔥 Q&A - No topic, ask anything! At 1pm PDT click here to join us on Zoom!
  • Louis Auneau

    05/19/2023, 8:02 PM
    Hello! I am trying to deploy Airbyte using the Helm chart on Kubernetes. I disabled the bundled airbyte/postgres with
    postgresql.enabled: false
    in order to configure my own CloudSQL database, but I am having a hard time understanding the difference between the
    global.database
    and
    externalDatabase
    values. Would you have any documentation explaining it? Thank you in advance and have a nice day!
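Editor's note, heavily hedged: exact keys vary by chart version, so diff against the chart's own values.yaml before copying anything. Roughly, global.database values tell the Airbyte services how to reach the database, while externalDatabase feeds the chart's connection settings once the bundled Postgres is disabled. A sketch with placeholder values:

```yaml
# Hedged sketch only; verify key names against your chart version's values.yaml.
postgresql:
  enabled: false
externalDatabase:
  host: 10.0.0.5          # CloudSQL private IP (placeholder)
  port: 5432
  database: airbyte
  user: airbyte
  password: "change-me"   # placeholder
```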
  • Thiago Guimarães

    05/19/2023, 9:06 PM
    Hello! I am trying to enable basic auth in the Airbyte application I have deployed in GCP using Kubernetes with Helm. I can't seem to find a way to do this in the application directly, and even when setting the env variables BASIC_AUTH_USERNAME and BASIC_AUTH_PASSWORD it simply doesn't ask for authentication when accessing Airbyte. Is there any way to configure it to do so? Ty!
  • Joey Taleño

    05/22/2023, 10:13 AM
    Hello Team, I just saw that this issue is already closed: https://github.com/airbytehq/airbyte/issues/25194 Any final decision from this issue? Will SCD be gone?
  • Haim Beyhan

    05/22/2023, 1:53 PM
    Airbyte server is failing when S3 usage is enabled. I annotated the service account with the IAM role. The IAM role has a policy that can get/write objects, and it has a trust relationship with the relevant OIDC provider. What am I missing here?
    Copy code
    values.yaml
    ============
    global.state.storage.type: "S3"
    global.logs.storage.type: "S3"
    global.logs.minio.enabled: false
    global.logs.s3.enabled: true
    global.logs.s3.bucket: "XXXXXX"
    global.logs.s3.bucketRegion: "XXXXXX"
    
    error
    =======
    2023-05-22 13:49:49 ERROR i.m.r.Micronaut(handleStartupException):338 - Error starting Micronaut server: null
    java.lang.IllegalArgumentException: null
    	at com.google.common.base.Preconditions.checkArgument(Preconditions.java:131) ~[guava-31.1-jre.jar:?]
    	at io.airbyte.config.storage.DefaultS3ClientFactory.validateBase(DefaultS3ClientFactory.java:36) ~[io.airbyte.airbyte-config-config-models-0.44.4.jar:?]
    	at io.airbyte.config.storage.DefaultS3ClientFactory.validate(DefaultS3ClientFactory.java:31) ~[io.airbyte.airbyte-config-config-models-0.44.4.jar:?]
    	at io.airbyte.config.storage.DefaultS3ClientFactory.<init>(DefaultS3ClientFactory.java:24) ~[io.airbyte.airbyte-config-config-models-0.44.4.jar:?]
    	at io.airbyte.config.helpers.CloudLogs.createCloudLogClient(CloudLogs.java:51) ~[io.airbyte.airbyte-config-config-models-0.44.4.jar:?]
    	at io.airbyte.config.helpers.LogClientSingleton.createCloudClientIfNull(LogClientSingleton.java:226) ~[io.airbyte.airbyte-config-config-models-0.44.4.jar:?]
    	at io.airbyte.config.helpers.LogClientSingleton.setWorkspaceMdc(LogClientSingleton.java:213) ~[io.airbyte.airbyte-config-config-models-0.44.4.jar:?]
    	at io.airbyte.server.LoggingEventListener.onApplicationEvent(LoggingEventListener.java:34) ~[io.airbyte-airbyte-server-0.44.4.jar:?]
    	at io.airbyte.server.LoggingEventListener.onApplicationEvent(LoggingEventListener.java:21) ~[io.airbyte-airbyte-server-0.44.4.jar:?]
    	at io.micronaut.context.event.ApplicationEventPublisherFactory.notifyEventListeners(ApplicationEventPublisherFactory.java:262) ~[micronaut-inject-3.9.0.jar:3.9.0]
    	at io.micronaut.context.event.ApplicationEventPublisherFactory.access$200(ApplicationEventPublisherFactory.java:60) ~[micronaut-inject-3.9.0.jar:3.9.0]
    	at io.micronaut.context.event.ApplicationEventPublisherFactory$2.publishEvent(ApplicationEventPublisherFactory.java:229) ~[micronaut-inject-3.9.0.jar:3.9.0]
    	at io.micronaut.http.server.netty.NettyHttpServer.lambda$fireStartupEvents$15(NettyHttpServer.java:587) ~[micronaut-http-server-netty-3.9.0.jar:3.9.0]
    	at java.util.Optional.ifPresent(Optional.java:178) ~[?:?]
    	at io.micronaut.http.server.netty.NettyHttpServer.fireStartupEvents(NettyHttpServer.java:581) ~[micronaut-http-server-netty-3.9.0.jar:3.9.0]
    	at io.micronaut.http.server.netty.NettyHttpServer.start(NettyHttpServer.java:298) ~[micronaut-http-server-netty-3.9.0.jar:3.9.0]
    	at io.micronaut.http.server.netty.NettyHttpServer.start(NettyHttpServer.java:104) ~[micronaut-http-server-netty-3.9.0.jar:3.9.0]
    	at io.micronaut.runtime.Micronaut.lambda$start$2(Micronaut.java:81) ~[micronaut-context-3.9.0.jar:3.9.0]
    	at java.util.Optional.ifPresent(Optional.java:178) ~[?:?]
    	at io.micronaut.runtime.Micronaut.start(Micronaut.java:79) ~[micronaut-context-3.9.0.jar:3.9.0]
    	at io.micronaut.runtime.Micronaut.run(Micronaut.java:323) ~[micronaut-context-3.9.0.jar:3.9.0]
    	at io.micronaut.runtime.Micronaut.run(Micronaut.java:309) ~[micronaut-context-3.9.0.jar:3.9.0]
    	at io.airbyte.server.Application.main(Application.java:15) ~[io.airbyte-airbyte-server-0.44.4.jar:?]