# ask-community-for-troubleshooting

    Caio César P. Ricciuti

    05/16/2022, 6:36 AM
    Hello all, hope you're all fine! I updated my Airbyte instance to version
    0.37.0-alpha
    and since this update I'm getting an error with my Google Search Console connectors... Is anyone else experiencing this problem? Any fixes/advice? Error:
    Exception('Error while refreshing access token: 400 Client Error: Bad Request for url: https://oauth2.googleapis.com/token')

    Logs are empty!
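A 400 from the token endpoint usually means the refresh token has expired or been revoked, or the client credentials are wrong; the response body (e.g. `invalid_grant`) says which, but the connector log swallows it. A minimal stdlib sketch for rebuilding the refresh request so it can be replayed with curl and the error body inspected; the credential values here are placeholders, not anything from the original message:

```python
# Hedged sketch: rebuild the Google OAuth2 refresh-token request body so it
# can be replayed manually (e.g. with curl) and the 400 error body inspected.
# CLIENT_ID / CLIENT_SECRET / REFRESH_TOKEN are placeholders you must fill in.
from urllib.parse import urlencode

TOKEN_URL = "https://oauth2.googleapis.com/token"

def build_refresh_body(client_id: str, client_secret: str, refresh_token: str) -> str:
    """Encode the form body for a refresh_token grant."""
    return urlencode({
        "client_id": client_id,
        "client_secret": client_secret,
        "refresh_token": refresh_token,
        "grant_type": "refresh_token",
    })

body = build_refresh_body("CLIENT_ID", "CLIENT_SECRET", "REFRESH_TOKEN")
# Replay with: curl -d '<body>' https://oauth2.googleapis.com/token
print(body)
```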

    Eric Gill

    05/16/2022, 7:28 AM
    Hey guys, my name is Eric and I am doing a little bit of experimentation with Airbyte to see if I can make a multi-node ClickHouse destination. I am fairly unfamiliar with Java and I am having trouble building the project locally with Gradle using the command specified in the docs:
    SUB_BUILD=PLATFORM ./gradlew build
    The error I am getting is:
    Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.
    Has anyone had similar issues? Thanks in advance, Eric

    Harvey Marshall

    05/16/2022, 12:33 PM
    Hi guys, any advice on this error:
    Could not connect with provided configuration. Error: com.google.gson.stream.MalformedJsonException: Use JsonReader.setLenient(true) to accept malformed JSON at line 1 column 1 path $
    I got it after testing the BigQuery source. Is it to do with my service account?
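"Malformed JSON at line 1 column 1" typically means the parser was handed something that is not JSON at all, and with BigQuery that is very often a service-account key that was truncated or mangled when pasted into the config. A hedged sketch (the required-field list is an assumption based on the standard Google service-account key layout) for checking the key before pasting it:

```python
# Hedged sketch: check that a service-account key string is valid JSON and
# carries the fields a Google service-account key normally has, before
# pasting it into the connector config.
import json

REQUIRED = {"type", "project_id", "private_key", "client_email"}

def check_service_account(raw: str) -> list:
    """Return a list of problems found in the raw JSON string (empty = looks ok)."""
    try:
        data = json.loads(raw)
    except json.JSONDecodeError as exc:
        return [f"not valid JSON: {exc}"]
    missing = REQUIRED - data.keys()
    return [f"missing field: {f}" for f in sorted(missing)]
```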

    Maëlise Castel

    05/16/2022, 1:12 PM
    Hi, I built my own source, which works perfectly on the Airbyte instance I have locally, but now I must make it work on the Airbyte instance in the dev environment. To do so I pushed the Docker image to the GitLab container registry, and I am trying to add the source in the Airbyte UI. However I get the following error. Could it be because I did not build the image correctly, or is there another reason?

    Bhaumik Shah

    05/16/2022, 2:29 PM
    Hello team, I am checking whether there is any way to support the Avro file format in the FILE (SFTP) source. I see a PR in progress for the S3 source: https://github.com/airbytehq/airbyte/pull/12602, and I am checking if the same can be applied to the FILE source.

    Mohit Reddy

    05/16/2022, 10:51 PM
    Hi! I am looking for some help to run Airbyte locally (and if possible, develop on k8s) with these changes - https://github.com/airbytehq/airbyte/pull/12876 For local development, I have been following https://docs.airbyte.com/contributing-to-airbyte/developing-locally#run-in-dev-mode-with-docker-compose, but when I try to set up the Kafka destination connector in the UI, I see this error (on the UI and in the server logs):
    2022-05-16 22:37:08 INFO i.a.w.t.TemporalAttemptExecution(get):108 - Docker volume job log path: /tmp/workspace/52acf6ef-3402-42c6-94c2-d8a05ff1e092/0/logs.log
    2022-05-16 22:37:08 INFO i.a.w.t.TemporalAttemptExecution(get):113 - Executing worker wrapper. Airbyte version: dev
    2022-05-16 22:37:08 INFO i.a.c.i.LineGobbler(voidCall):82 - Checking if airbyte/destination-kafka:0.1.8 exists...
    2022-05-16 22:37:08 INFO i.a.c.i.LineGobbler(voidCall):82 - airbyte/destination-kafka:0.1.8 not found locally. Attempting to pull the image...
    2022-05-16 22:37:11 INFO i.a.c.i.LineGobbler(voidCall):82 - Image does not exist.
    2022-05-16 22:37:11 INFO i.a.w.t.TemporalAttemptExecution(lambda$getWorkerThread$2):161 - Completing future exceptionally...
    io.airbyte.workers.WorkerException: Error while getting checking connection.
    ....
    I had to bump the version of destination-kafka from 0.1.7 -> 0.1.8 given the change. I thought that just changing the `_definitions.yaml` file and running
    ./gradlew :airbyte-config:init:processResources
    would update the
    _specs.yaml
    file - this failed, so I had to manually update the _specs.yaml file as well. Please do let me know if I am missing any information. Any leads here highly appreciated 😄
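When a version bump like this goes wrong, the first thing to verify is that the tag in the definitions file actually changed. A hedged sketch that pulls the `dockerImageTag` for a given connector out of a definitions-style YAML snippet with a plain regex; the field names (`dockerRepository` / `dockerImageTag`) follow the layout used in Airbyte's `*_definitions.yaml` files, so adapt them if yours differ:

```python
# Hedged sketch: extract the dockerImageTag declared for a connector in a
# definitions-style YAML snippet, to confirm a version bump took effect.
import re

def image_tag(yaml_text: str, repository: str):
    """Return the dockerImageTag on the line following the given dockerRepository."""
    pattern = (r"dockerRepository:\s*" + re.escape(repository)
               + r"\s*\n\s*dockerImageTag:\s*([\w.\-]+)")
    match = re.search(pattern, yaml_text)
    return match.group(1) if match else None

definitions = """
- name: Kafka
  dockerRepository: airbyte/destination-kafka
  dockerImageTag: 0.1.8
"""
print(image_tag(definitions, "airbyte/destination-kafka"))  # 0.1.8
```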

    Maëlise Castel

    05/17/2022, 1:26 PM
    Hi, is it possible to use the Airbyte classes via airbyte.mycloudrepo.io/public/repositories/airbyte-public-jars ? I'm trying to add them to my Gradle setup to build my destination (which I build outside of the airbyte repo), but it never recognizes the Airbyte classes. Here's the relevant part of my build.gradle:
    repositories {
        mavenCentral()
        maven {
            url 'https://airbyte.mycloudrepo.io/public/repositories/airbyte-public-jars/'
        }
    }

    Alec Dou

    05/17/2022, 2:25 PM
    Hi team, I'm building a project and would like to use Airbyte to integrate data sources. I'm still figuring out how Airbyte works and want to ask: can I access all features of Airbyte through the APIs alone, without the web UI?

    Ayush

    05/18/2022, 7:05 AM
    Hi, is there any way I can use Airbyte for always-on streaming jobs, for example where the source is a web socket that is always sending data?

    Yash Makwana

    05/18/2022, 2:24 PM
    https://discuss.airbyte.io/t/rest-api-to-bigquery/1015 Can someone assist me with this issue?

    Jared Zhao (Polyture)

    05/18/2022, 6:22 PM
    Does anyone know if we can use Airbyte Cloud’s API? The use case is this: https://airbyte.com/embed-airbyte-connectors-with-api

    Andras N.

    05/18/2022, 8:06 PM
    Hi everyone, I was playing around with Airbyte and the Google Ads connector, and until recently, using only the ad_group_ad_report stream, I was able to get all the costs/metrics associated with a Google Ads account. However, in my latest tests with this stream I observed that it is missing a lot of clicks and costs (as far as I was able to check, Smart Campaign clicks and DSAs are not included anymore). I am not sure if this is an Airbyte sync issue or something else; hope somebody can shed some light on it. Thanks!

    Rodrigo Capuzzi

    05/18/2022, 8:33 PM
    Hi folks, I am trying to set up a connection between Airtable and PostgreSQL, and I am getting the error below. Am I missing something?

    Craig Condie

    05/18/2022, 11:17 PM
    Hi all... I kind of have a double whammy of being new to Airbyte and just starting at a new company. I want to set up a source connector for a Postgres database. I enter the information requested and it fails to connect with the error: "The connection tests failed. Could not connect with provided configuration. Error: HikariPool-1 - Connection is not available, request timed out after 30001ms." However, when I enter the exact same info in DataGrip or DBeaver, it connects successfully. Any ideas on what I'm doing wrong?
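A common cause of the "works in DataGrip, times out in Airbyte" mismatch is that Airbyte runs in a container, so `localhost` there is the container itself, not the machine DataGrip runs on. A hedged stdlib sketch of a plain TCP reachability check that could be run from inside the Airbyte worker container to see whether the Postgres host/port is reachable at all:

```python
# Hedged sketch: plain TCP reachability check, runnable from inside the
# Airbyte worker container, to see whether the Postgres host/port can be
# reached at all ("localhost" inside a container is the container itself).
import socket

def reachable(host: str, port: int, timeout: float = 3.0) -> bool:
    """True if a TCP connection to host:port succeeds within the timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False
```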

    이진규

    05/19/2022, 12:45 AM
    Hi team~ I have some questions. I'm trying to move all the files in my S3 bucket (log files compressed to gz) to another S3 bucket using Airbyte. However, all attempts failed because the S3 source connector only supports the CSV, Parquet, and Avro file formats. Is there a way to move all files in one S3 bucket to another S3 bucket using Airbyte?
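The S3 source parses files into records; it is not a file mover, so raw `.gz` logs are usually easier to copy bucket-to-bucket outside Airbyte. A hedged sketch of the key-mapping half of such a copy; the actual transfer would use an S3 client (e.g. boto3's server-side copy), which is deliberately omitted so the helper stays self-contained:

```python
# Hedged sketch: map .gz keys under a source prefix to the same paths under
# a destination prefix. The actual copy step (e.g. boto3 s3.copy per pair)
# is omitted; this only plans the work.
def plan_copies(keys, src_prefix: str, dst_prefix: str):
    """Map each .gz key under src_prefix to the same path under dst_prefix."""
    plans = []
    for key in keys:
        if key.startswith(src_prefix) and key.endswith(".gz"):
            plans.append((key, dst_prefix + key[len(src_prefix):]))
    return plans

print(plan_copies(["logs/a.gz", "logs/b.txt"], "logs/", "backup/logs/"))
```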

    Erez Zohar

    05/19/2022, 9:42 AM
    Is it possible to implement a custom connector in the cloud edition?

    Íñigo

    05/19/2022, 10:41 AM
    Hi team! I've seen that there is an Airbyte source connector for Azure Table Storage, but not for Azure Blob Storage. I want to upload a CSV to Azure Blob Storage and then move it to another destination using Airbyte. Is there a plan to include that source connector soon? Thanks in advance! 🙂

    Petro Tiurin

    05/19/2022, 11:12 AM
    Hi team, what qualifies a connector to be available on Airbyte Cloud? I’ve implemented a connector that works on Airbyte Open Source and I’m wondering if there’s anything that needs to be added to make it cloud-compatible as well.

    Slackbot

    05/19/2022, 1:31 PM
    This message was deleted.

    Mikhail

    05/19/2022, 1:47 PM
    Hi folks! Is there any way to modify the webhook (success/fail notification) message to include the connection id?

    Sergi van den Berg

    05/19/2022, 1:58 PM
    Hi all, does anybody know where you can find the database URL for MongoDB?
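MongoDB's standard connection string follows the `mongodb://user:password@host:port/database` format (Atlas clusters use the `mongodb+srv://` scheme and omit the port). A small sketch that assembles one from the pieces a connector form typically asks for, percent-encoding the credentials; the sample values are placeholders:

```python
# Hedged sketch: build a standard mongodb:// connection string from its
# parts, percent-encoding the credentials so characters like '@' are safe.
from urllib.parse import quote_plus

def mongo_url(host: str, port: int, user: str, password: str, db: str) -> str:
    """Build a mongodb:// URL with percent-encoded credentials."""
    return f"mongodb://{quote_plus(user)}:{quote_plus(password)}@{host}:{port}/{db}"

print(mongo_url("localhost", 27017, "airbyte", "p@ss", "mydb"))
# mongodb://airbyte:p%40ss@localhost:27017/mydb
```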

    Lavanya Siliveri

    05/20/2022, 4:27 AM
    Hi Team, noob here 👩 I was going through the documentation for Connector Development Kit and noticed that the “Airbyte Specification” link is broken. Is it just me or?

    Slackbot

    05/20/2022, 3:53 PM
    This message was deleted.

    Anton Podviaznikov

    05/20/2022, 5:11 PM
    How would I change the User-Defined Cursor? I don't see an input to change that in the UI, and I don't see anything like the screenshot in the docs: https://docs.airbyte.com/understanding-airbyte/connections/incremental-append#user-defined-cursor

    Dimitris Bachtsevanis

    05/20/2022, 8:57 PM
    Hi, I am using the Google Ads source connector with a custom GAQL query, but the query doesn't return PAUSED entities, i.e. campaigns. Any ideas why this is happening?

    Dimitris Bachtsevanis

    05/20/2022, 8:57 PM
    The same query works fine here: https://www.awql.me/gaql

    Anton Podviaznikov

    05/21/2022, 8:47 AM
    Hi folks. How would I speed up my sync jobs? I'm using the latest version of Airbyte (in a k8s cluster). I tried to sync PG to Snowflake and got this result:
    49.25 GB | 32,773,497 emitted records | 32,773,497 committed records | 2h 32m 18s | Sync
    So the speed is around 5 MB/s. How would I make it faster? I increased
    SUBMITTER_NUM_THREADS=40
    MAX_SYNC_WORKERS=20
    as described here: https://discuss.airbyte.io/t/scaling-airbyte-on-k8s-increased-job-parallelism/826 But I wasn't sure how to increase the number of workers. Also, what else can I tune to make jobs go faster?
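A quick sanity check of the throughput quoted above, as a sketch (assuming "GB" means binary gibibytes, i.e. 1024 MB each):

```python
# Back-of-the-envelope check of the quoted sync throughput:
# 49.25 GB transferred over 2h 32m 18s.
seconds = 2 * 3600 + 32 * 60 + 18   # 9138 s total
mb = 49.25 * 1024                    # 50432 MB, assuming binary GB
print(round(mb / seconds, 1))        # ~5.5 MB/s, matching the ~5 MB/s estimate
```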

    CM

    05/22/2022, 2:05 PM
    Can anyone help me with this conversion error?
    Is this your first time deploying Airbyte?: Yes
    OS Version / Instance: Windows
    Memory / Disk: 16 gb / 1 tb
    Deployment: Docker
    Airbyte Version: Latest
    Source name/version: Facebook Marketing 0.2.48
    Destination name/version: MS SQL Server 0.1.17
    Step: The issue happens at the end of the sync
    Description: Everything seems to go well, then I get this error and sync fails to complete:
    ('42000', '[42000] [Microsoft][ODBC Driver 17 for SQL Server][SQL Server]Error converting data type nvarchar to float. (8114) (SQLMoreResults)
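SQL Server raises error 8114 when a string column carries a value that cannot be cast to float (empty strings and text like "N/A" are frequent offenders in marketing-API data). A hedged first-pass filter for locating such values in the incoming records before the sync's typing step; note Python's `float()` differs from T-SQL `CONVERT` in a few corner cases, so treat this only as a rough screen:

```python
# Hedged sketch: list the string values in a column that cannot be parsed as
# a float, as a rough proxy for what SQL Server's nvarchar -> float
# conversion would reject. None (NULL) is allowed through.
def non_castable(values):
    """Return the values that cannot be parsed as a float (None is allowed)."""
    bad = []
    for v in values:
        if v is None:
            continue
        try:
            float(v)
        except (TypeError, ValueError):
            bad.append(v)
    return bad

print(non_castable(["1.5", "", "N/A", None, "2"]))  # ['', 'N/A']
```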

    ASulaiman

    05/22/2022, 3:09 PM
    Please help, it's my first time installing Airbyte on Amazon EKS. I'm using an S3 bucket to store logs, but this problem appears:
    "Cannot end publish with com.van.logging.gcp.CloudStoragePublishHelper@13049a9a due to error: Cannot end publishing: Invalid bucket name: '${env:GCS_LOG_BUCKET}'"
    I'm not using a GCS bucket and the config pointing to GCS is off. This happens only on the airbyte-worker pods:
    apiVersion: apps/v1
    kind: Deployment
    metadata:
      name: airbyte-worker
    spec:
      replicas: 1
      selector:
        matchLabels:
          airbyte: worker
      template:
        metadata:
          labels:
            airbyte: worker
        spec:
          serviceAccountName: airbyte-admin
          automountServiceAccountToken: true
          containers:
            - name: airbyte-worker-container
              image: airbyte/worker
              env:
                - name: AIRBYTE_VERSION
                  valueFrom:
                    configMapKeyRef:
                      name: airbyte-env
                      key: AIRBYTE_VERSION
                # - name: CONFIG_ROOT
                #   valueFrom:
                #     configMapKeyRef:
                #       name: airbyte-env
                #       key: CONFIG_ROOT
                - name: DATABASE_HOST
                  valueFrom:
                    configMapKeyRef:
                      name: airbyte-env
                      key: DATABASE_HOST
                - name: DATABASE_PORT
                  valueFrom:
                    configMapKeyRef:
                      name: airbyte-env
                      key: DATABASE_PORT
                - name: DATABASE_PASSWORD
                  valueFrom:
                    secretKeyRef:
                      name: airbyte-secrets
                      key: DATABASE_PASSWORD
                - name: DATABASE_URL
                  valueFrom:
                    configMapKeyRef:
                      name: airbyte-env
                      key: DATABASE_URL
                - name: DATABASE_USER
                  valueFrom:
                    secretKeyRef:
                      name: airbyte-secrets
                      key: DATABASE_USER
                - name: TRACKING_STRATEGY
                  valueFrom:
                    configMapKeyRef:
                      name: airbyte-env
                      key: TRACKING_STRATEGY
                - name: WORKSPACE_DOCKER_MOUNT
                  value: workspace
                - name: WORKSPACE_ROOT
                  valueFrom:
                    configMapKeyRef:
                      name: airbyte-env
                      key: WORKSPACE_ROOT
                - name: WORKER_ENVIRONMENT
                  valueFrom:
                    configMapKeyRef:
                      name: airbyte-env
                      key: WORKER_ENVIRONMENT
                - name: LOCAL_ROOT
                  valueFrom:
                    configMapKeyRef:
                      name: airbyte-env
                      key: LOCAL_ROOT
                - name: WEBAPP_URL
                  valueFrom:
                    configMapKeyRef:
                      name: airbyte-env
                      key: WEBAPP_URL
                - name: TEMPORAL_HOST
                  valueFrom:
                    configMapKeyRef:
                      name: airbyte-env
                      key: TEMPORAL_HOST
                - name: TEMPORAL_WORKER_PORTS
                  valueFrom:
                    configMapKeyRef:
                      name: airbyte-env
                      key: TEMPORAL_WORKER_PORTS
                - name: LOG_LEVEL
                  valueFrom:
                    configMapKeyRef:
                      name: airbyte-env
                      key: LOG_LEVEL
                - name: JOB_KUBE_NAMESPACE
                  valueFrom:
                    fieldRef:
                      fieldPath: metadata.namespace
                - name: SUBMITTER_NUM_THREADS
                  valueFrom:
                    configMapKeyRef:
                      name: airbyte-env
                      key: SUBMITTER_NUM_THREADS
                - name: JOB_MAIN_CONTAINER_CPU_REQUEST
                  valueFrom:
                    configMapKeyRef:
                      name: airbyte-env
                      key: JOB_MAIN_CONTAINER_CPU_REQUEST
                - name: JOB_MAIN_CONTAINER_CPU_LIMIT
                  valueFrom:
                    configMapKeyRef:
                      name: airbyte-env
                      key: JOB_MAIN_CONTAINER_CPU_LIMIT
                - name: JOB_MAIN_CONTAINER_MEMORY_REQUEST
                  valueFrom:
                    configMapKeyRef:
                      name: airbyte-env
                      key: JOB_MAIN_CONTAINER_MEMORY_REQUEST
                - name: JOB_MAIN_CONTAINER_MEMORY_LIMIT
                  valueFrom:
                    configMapKeyRef:
                      name: airbyte-env
                      key: JOB_MAIN_CONTAINER_MEMORY_LIMIT
                - name: S3_LOG_BUCKET
                  valueFrom:
                    configMapKeyRef:
                      name: airbyte-env
                      key: S3_LOG_BUCKET
                - name: S3_LOG_BUCKET_REGION
                  valueFrom:
                    configMapKeyRef:
                      name: airbyte-env
                      key: S3_LOG_BUCKET_REGION
                - name: AWS_ACCESS_KEY_ID
                  valueFrom:
                    secretKeyRef:
                      name: airbyte-secrets
                      key: AWS_ACCESS_KEY_ID
                - name: AWS_SECRET_ACCESS_KEY
                  valueFrom:
                    secretKeyRef:
                      name: airbyte-secrets
                      key: AWS_SECRET_ACCESS_KEY
                - name: S3_MINIO_ENDPOINT
                  valueFrom:
                    configMapKeyRef:
                      name: airbyte-env
                      key: S3_MINIO_ENDPOINT
                - name: S3_PATH_STYLE_ACCESS
                  valueFrom:
                    configMapKeyRef:
                      name: airbyte-env
                      key: S3_PATH_STYLE_ACCESS
                # - name: GOOGLE_APPLICATION_CREDENTIALS
                #   valueFrom:
                #     secretKeyRef:
                #       name: airbyte-secrets
                #       key: GOOGLE_APPLICATION_CREDENTIALS
                # - name: GCS_LOG_BUCKET
                #   valueFrom:
                #     configMapKeyRef:
                #       name: airbyte-env
                #       key: GCS_LOG_BUCKET
                - name: INTERNAL_API_HOST
                  valueFrom:
                    configMapKeyRef:
                      name: airbyte-env
                      key: INTERNAL_API_HOST
                - name: JOB_KUBE_TOLERATIONS
                  valueFrom:
                    configMapKeyRef:
                      name: airbyte-env
                      key: JOB_KUBE_TOLERATIONS
                - name: JOB_KUBE_ANNOTATIONS
                  valueFrom:
                    configMapKeyRef:
                      name: airbyte-env
                      key: JOB_KUBE_ANNOTATIONS
                - name: JOB_KUBE_NODE_SELECTORS
                  valueFrom:
                    configMapKeyRef:
                      name: airbyte-env
                      key: JOB_KUBE_NODE_SELECTORS
                - name: JOB_KUBE_MAIN_CONTAINER_IMAGE_PULL_POLICY
                  valueFrom:
                    configMapKeyRef:
                      name: airbyte-env
                      key: JOB_KUBE_MAIN_CONTAINER_IMAGE_PULL_POLICY
                # todo: add other state storage keys
                - name: STATE_STORAGE_MINIO_BUCKET_NAME
                  valueFrom:
                    configMapKeyRef:
                      name: airbyte-env
                      key: STATE_STORAGE_MINIO_BUCKET_NAME
                - name: STATE_STORAGE_MINIO_ENDPOINT
                  valueFrom:
                    configMapKeyRef:
                      name: airbyte-env
                      key: STATE_STORAGE_MINIO_ENDPOINT
                # - name: STATE_STORAGE_MINIO_ACCESS_KEY
                #   valueFrom:
                #     secretKeyRef:
                #       name: airbyte-secrets
                #       key: STATE_STORAGE_MINIO_ACCESS_KEY
                # - name: STATE_STORAGE_MINIO_SECRET_ACCESS_KEY
                #   valueFrom:
                #     secretKeyRef:
                #       name: airbyte-secrets
                #       key: STATE_STORAGE_MINIO_SECRET_ACCESS_KEY
                - name: CONTAINER_ORCHESTRATOR_ENABLED
                  valueFrom:
                    configMapKeyRef:
                      name: airbyte-env
                      key: CONTAINER_ORCHESTRATOR_ENABLED
              ports:
                - containerPort: 9000 # for heartbeat server
                - containerPort: 9001 # start temporal worker port pool
                - containerPort: 9002
                - containerPort: 9003
                - containerPort: 9004
                - containerPort: 9005
                - containerPort: 9006
                - containerPort: 9007
                - containerPort: 9008
                - containerPort: 9009
                - containerPort: 9010
                - containerPort: 9011
                - containerPort: 9012
                - containerPort: 9013
                - containerPort: 9014
                - containerPort: 9015
                - containerPort: 9016
                - containerPort: 9017
                - containerPort: 9018
                - containerPort: 9019
                - containerPort: 9020
                - containerPort: 9021
                - containerPort: 9022
                - containerPort: 9023
                - containerPort: 9024
                - containerPort: 9025
                - containerPort: 9026
                - containerPort: 9027
                - containerPort: 9028
                - containerPort: 9029
                - containerPort: 9030 # end temporal worker port pool
          #     volumeMounts:
          #       - name: gcs-log-creds-volume
          #         mountPath: /secrets/gcs-log-creds
          #         readOnly: true
          # volumes:
          #   - name: gcs-log-creds-volume
          #     secret:
          #       secretName: gcs-log-creds
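The literal `${env:GCS_LOG_BUCKET}` in the error suggests a log4j2 environment lookup that was never resolved: the GCS log appender appears to still be active even though its bucket variable is unset. A hedged sketch that scans any config or log text for unresolved `${env:...}` placeholders so the missing variables can be listed:

```python
# Hedged sketch: find env-var lookups that were left unresolved, i.e. appear
# literally as ${env:NAME} in config or log output.
import re

def unresolved_env(text: str):
    """Return the env-var names left as literal ${env:NAME} lookups."""
    return sorted(set(re.findall(r"\$\{env:([A-Z0-9_]+)\}", text)))

msg = "Invalid bucket name: '${env:GCS_LOG_BUCKET}'"
print(unresolved_env(msg))  # ['GCS_LOG_BUCKET']
```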

    Shubhransh Bhargava

    05/23/2022, 5:03 AM
    We have a use case where we want to give the Airbyte Docker image to different teams so they can add their own sources, but our requirement is that the destination should be fixed, or that we can set a default destination. Is this use case possible, or do they have to add the destination every time?