# ask-community-for-troubleshooting
  • a

    Arik Elbag

    02/22/2023, 8:37 PM
    Hello!
  • a

    Arik Elbag

    02/22/2023, 8:38 PM
    Question for those who have connected QuickBooks using the open-source version: what is the master key to use? I am assuming the ID only, but wanted to confirm.
  • j

    Justen Walker

    02/22/2023, 8:41 PM
    What's the correct way to hook up Airbyte to OTEL inside Kubernetes? I'm guessing it starts with setting
    metrics.metricsClient: otel
    but
    OTEL_COLLECTOR_ENDPOINT
    is sort of dynamic, since it needs to be set to the host IP of the pod -- i.e., it can't come directly from the ConfigMap
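    If the collector runs as a DaemonSet on each node, one common pattern is to inject the node's IP through the Kubernetes downward API and build the endpoint from it. A minimal hedged sketch (assumes the standard OTLP gRPC port 4317; the names and structure are illustrative, not the chart's own template):
    Copy code
    # env entries on the Airbyte worker Deployment
    env:
      - name: HOST_IP
        valueFrom:
          fieldRef:
            fieldPath: status.hostIP     # resolves to the node's IP at pod startup
      - name: OTEL_COLLECTOR_ENDPOINT
        value: "http://$(HOST_IP):4317"  # $(VAR) expansion of a previously declared env var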
  • k

    Kumar K

    02/22/2023, 9:28 PM
    Hi team, I have configured Airbyte, but when I try to log in it prompts for a username and password. Can someone please help me?
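    For context: recent open-source Docker deployments ship with HTTP basic auth enabled, configured through the .env file. The values below are the documented defaults at the time of writing; verify against your version's docs and change them after first login:
    Copy code
    # .env (Airbyte OSS, Docker deployment)
    BASIC_AUTH_USERNAME=airbyte
    BASIC_AUTH_PASSWORD=password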
  • i

    Ignacio Alasia

    02/22/2023, 10:00 PM
    Good afternoon everyone! We are running Airbyte on Docker, and since Saturday morning we have been experiencing Postgres connector failures when using CDC. It reports insufficient permissions; we reviewed the documentation and followed it step by step, and it still fails. We also tried assigning a superuser and it didn't work. Is this happening to anyone else? Here is the setup script for the user:
    Copy code
    CREATE USER "shared-h3-airbyte" WITH PASSWORD '**********';
    GRANT USAGE ON SCHEMA public TO "shared-h3-airbyte";
    GRANT SELECT ON TABLE public.addresses_cells TO "shared-h3-airbyte";
    GRANT SELECT ON TABLE public.coverages_cells TO "shared-h3-airbyte";
    ALTER DEFAULT PRIVILEGES IN SCHEMA public GRANT SELECT ON TABLES TO "shared-h3-airbyte";
    GRANT rds_replication TO "shared-h3-airbyte";
    
    SELECT pg_drop_replication_slot('airbyte_shared_h3');
    
    SELECT pg_create_logical_replication_slot('airbyte_shared_h3', 'pgoutput');
    
    ALTER TABLE addresses_cells REPLICA IDENTITY DEFAULT;
    ALTER TABLE coverages_cells REPLICA IDENTITY DEFAULT;
    
    DROP PUBLICATION airbyte_publication_shared_h3;
    
    CREATE PUBLICATION airbyte_publication_shared_h3 FOR TABLE addresses_cells,coverages_cells;
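    The grants above match the documented CDC setup, so a few server-side checks are worth running before blaming permissions. A hedged checklist using standard Postgres catalog queries (slot and publication names taken from the script above):
    Copy code
    SHOW wal_level;  -- must be 'logical' for CDC (on RDS this comes from the parameter group)
    SELECT slot_name, plugin, active FROM pg_replication_slots;  -- slot exists, plugin is pgoutput
    SELECT * FROM pg_publication_tables WHERE pubname = 'airbyte_publication_shared_h3';
    SELECT rolname, rolreplication FROM pg_roles WHERE rolname = 'shared-h3-airbyte';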
  • k

    Konstantin Lackner

    02/22/2023, 11:50 PM
    Hi, I want to set up GA4 as a source; however, I'm running into the following issue:
    Copy code
    HTTPError('403 Client Error: Forbidden for url: https://analyticsdata.googleapis.com/v1beta/properties/XXXXX/metadata')
    I already found this related thread: https://discuss.airbyte.io/t/ga4-connector-connection-test-fails/3439/2 but I already added the user in Google Analytics at the account level, so I'm not sure what's wrong here...
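    A 403 from analyticsdata.googleapis.com can also mean the Google Analytics Data API isn't enabled in the GCP project that owns the credentials, independent of the property-level access. A hedged check, assuming the gcloud CLI and a placeholder project ID:
    Copy code
    gcloud services list --enabled --project=YOUR_PROJECT_ID | grep analyticsdata
    gcloud services enable analyticsdata.googleapis.com --project=YOUR_PROJECT_ID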
  • m

    Manav Kothari

    02/23/2023, 3:57 AM
    Hi, for Mongo as a source connector I am getting this error: 'PlanExecutor error during aggregation :: caused by :: Exceeded memory limit for $group, but didn't allow external sort. Pass allowDiskUse:true to opt in.' How do I set allowDiskUse: true -- is there any configuration for that? - dockerImage: "airbyte/source-mongodb-v2:0.1.19"
    Copy code
    Something went wrong in the connector. See the logs for more details.
    Stack Trace: com.mongodb.MongoCommandException: Command failed with error 292 (QueryExceededMemoryLimitNoDiskUseAllowed): 'PlanExecutor error during aggregation :: caused by :: Exceeded memory limit for $group, but didn't allow external sort. Pass allowDiskUse:true to opt in.' on server cluster0-shard-00-01.ucrrp.mongodb.net:27017. The full response is {"ok": 0.0, "errmsg": "PlanExecutor error during aggregation:: caused by :: Exceeded memory limit for $group, but didn't allow external sort. Pass allowDiskUse:true to opt in.", "code": 292, "codeName": "QueryExceededMemoryLimitNoDiskUseAllowed", "$clusterTime": {"clusterTime": {"$timestamp": {"t": 1677123912, "i": 96}}, "signature": {"hash": {"$binary": {"base64": "ZuTk1CUhOayPeR/0ir98n3rSvOA=", "subType": "00"}}, "keyId": 7168578009350275074}}, "operationTime": {"$timestamp": {"t": 1677123912, "i": 96}}}
    	at com.mongodb.internal.connection.ProtocolHelper.getCommandFailureException(ProtocolHelper.java:198)
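    For reference, allowDiskUse is an option on the aggregation itself, so it has to be sent by whatever issues the query; in this case that is the connector, not the server. A hedged mongosh illustration of what the error is asking for (collection and field names are placeholders) -- whether the option is actually passed depends on the connector version, so checking for a newer airbyte/source-mongodb-v2 release is the first thing to try:
    Copy code
    // what the server wants the client to send along with the $group stage
    db.my_collection.aggregate(
      [ { $group: { _id: "$my_field", n: { $sum: 1 } } } ],
      { allowDiskUse: true }  // lets $group spill to disk instead of hitting the memory limit
    )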
  • g

    Gautam B

    02/23/2023, 5:33 AM
    Where can we get the default
    syncCatalog
    values for a connector when using the Airbyte API
    *POST* /v1/connections/create
    ? We are provisioning various third-party apps in Airbyte via the API, and right now we are not able to find the syncCatalog for each app. We are creating the connector in the UI and taking the schema from the DB, but this is just a hack. Is there a place where we can get the base/default schema definition for each connector?
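    A hedged pointer: the configuration API can discover a source's catalog for you, which is the usual way to obtain a syncCatalog to feed into /v1/connections/create. A sketch with a placeholder host and source ID; field names should be checked against the API reference for your version:
    Copy code
    curl -X POST http://localhost:8000/api/v1/sources/discover_schema \
      -H "Content-Type: application/json" \
      -d '{"sourceId": "YOUR_SOURCE_ID", "disable_cache": true}'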
  • l

    Le Minh Nguyen

    02/23/2023, 6:53 AM
    Hi, anyone got experience connecting HubSpot with BigQuery and using the normalization process? I keep receiving these 2 errors: • Database Error in model companies_properties (models/generated/airbyte_incremental/hubspot_crm/companies_properties.sql): Bad double value: N/A • The view is too large. The maximum standard SQL view length is 256.000K characters, including comments and white space characters.: compiled SQL at ../build/run/airbyte_utils/models/generated/airbyte_views/hubspot_crm/deals_properties_ab2.sql I understand that in the first case the data mapping itself has an issue, and in the second case the deals properties are simply too large, but I am quite unsure how to fix this.
  • u

    김건희

    02/23/2023, 7:06 AM
    Hi, I'm trying to use Airbyte to connect to Airtable. With the previous connector version I could connect Airbyte to Airtable using a personal access token, but now I get an error message like the one below. I'm using Airbyte version 0.40.32 deployed on EKS using Helm. Why does this happen? And if I want to use OAuth2.0 authentication, what values are needed for the client ID, client secret, and refresh token? Let me know. Thanks
    Copy code
    2023-02-23 06:16:28 ERROR i.a.w.i.DefaultAirbyteStreamFactory(internalLog):163 - Check failed
  • n

    Narendran Omprakash

    02/23/2023, 8:03 AM
    Hello, I want to contribute to the SendGrid Airbyte connector. I modified the connector’s code a bit to support more entities (entities that return a single object instead of a list of objects). Can I open a new issue or comment under this issue: https://github.com/airbytehq/airbyte/issues/2117
  • x

    xi-chen.qi

    02/23/2023, 9:20 AM
    Hi, team. I want to deploy Airbyte through Kubernetes or the Kubernetes Helm chart, but I can't find the custom files, such as: https://github.com/airbytehq/airbyte.git/kube/overlays/stable?ref=master and https://github.com/airbytehq/airbyte/blob/master/charts/airbyte/values.yaml
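    A hedged note: if the kustomize overlays are missing from the revision you checked out, Helm is the documented path, and the chart lives in its own repository. Something along these lines (verify the repo URL against the current docs):
    Copy code
    helm repo add airbyte https://airbytehq.github.io/helm-charts
    helm repo update
    helm search repo airbyte   # confirm the chart is visible before installing
    helm install airbyte airbyte/airbyte --namespace airbyte --create-namespace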
  • j

    Jana Kniel

    02/23/2023, 10:54 AM
    Hey team, I'd like to ask if there is a plan to add the table "conversion" to the LinkedIn Ads connection. Thank you for your help/information :)
  • b

    Bevis Lin

    02/23/2023, 11:25 AM
    Hi team, I'm writing the schema.json for my new API source and syncing to MySQL with basic normalization. Does it support nested array objects more than 3 layers deep? When it syncs to MySQL, the hash-id column maps the first table to the second, but not the second to the third. My API response looks like this:
    Copy code
    {
      "size": 0,
      "offset": 0,
      "stores": [
        {
          "id": "S0000",
          "name": "xxx shop",
          "foods": [
            {
              "food_id": "f12345",
              "food_nm": "Apple",
              "sold_out": true
            }
          ],
          "create_date": "2019-12-28",
          "update_date": "2023-02-23"
        },
        {
          "id": "S0001",
          "name": "yyy shop",
          "foods": [
            {
              "food_id": "f11111",
              "food_nm": "Banana",
              "sold_out": false
            }
          ],
          "create_date": "2012-01-22",
          "update_date": "2023-02-23"
        }
      ]
    }
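    For reference, a hedged sketch of the JSON Schema that payload implies; normalization unnests one sub-table per array level, so the declared type at each depth has to match the data (draft-07 syntax, field names taken from the sample above):
    Copy code
    {
      "type": "object",
      "properties": {
        "size":   { "type": "integer" },
        "offset": { "type": "integer" },
        "stores": {
          "type": "array",
          "items": {
            "type": "object",
            "properties": {
              "id":   { "type": "string" },
              "name": { "type": "string" },
              "foods": {
                "type": "array",
                "items": {
                  "type": "object",
                  "properties": {
                    "food_id":  { "type": "string" },
                    "food_nm":  { "type": "string" },
                    "sold_out": { "type": "boolean" }
                  }
                }
              },
              "create_date": { "type": "string", "format": "date" },
              "update_date": { "type": "string", "format": "date" }
            }
          }
        }
      }
    }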
  • t

    Thiago Villani

    02/23/2023, 11:56 AM
    Hello, I'm using Airbyte in a local VM with Docker. When I run simultaneous jobs, the CPU and memory hit their limits. The .env file has some parameterizations; does anyone have a model, or guidance on how I should set them? I have a VM with 2 CPUs and 8 GB of memory, currently set, for example, to: JOB_MAIN_CONTAINER_MEMORY_REQUEST=4g JOB_MAIN_CONTAINER_MEMORY_LIMIT=4g JOB_MAIN_CONTAINER_CPU_REQUEST=0.75 JOB_MAIN_CONTAINER_CPU_LIMIT=0.80
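    A hedged starting point for a 2 vCPU / 8 GB VM: requesting 4g per job container means two concurrent jobs already claim all memory before the Airbyte core containers get any, so it usually works better to shrink the per-job limits and cap concurrency instead. A sketch only; tune against your actual syncs:
    Copy code
    # .env -- smaller per-job resources, with headroom left for server/worker/db
    JOB_MAIN_CONTAINER_CPU_REQUEST=0.25
    JOB_MAIN_CONTAINER_CPU_LIMIT=0.5
    JOB_MAIN_CONTAINER_MEMORY_REQUEST=1g
    JOB_MAIN_CONTAINER_MEMORY_LIMIT=2g
    MAX_SYNC_WORKERS=2   # cap simultaneous syncs rather than raising limits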
  • d

    Daniel Pietschmann

    02/23/2023, 1:34 PM
    Hey, I installed Airbyte on a virtual machine in the cloud. Now I am trying to do custom transformations with dbt. When I add my dbt repository URL in the Airbyte UI, do I have to use HTTPS, or can I use SSH as well? Can someone give me a hint?
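    A hedged note: the custom-transformation docs describe HTTPS git URLs, and there is no UI field for an SSH key, so SSH likely won't work out of the box. For a private repo the usual workaround is embedding a token in the HTTPS URL (everything below is a placeholder):
    Copy code
    https://<GIT_USERNAME>:<PERSONAL_ACCESS_TOKEN>@github.com/your-org/your-dbt-repo.git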
  • j

    Jason Vondersmith

    02/23/2023, 1:59 PM
    Good Morning! I'm trying to use the
    Kustomer
    connector, currently in Alpha. I see there was a singer/user-agent issue with it originally. But then I found this PR that seems like it would correct the implementation. Is there a way to know when that PR might get merged?
  • f

    Francisco Andres

    02/23/2023, 2:04 PM
    Hi, we are using Airbyte in a VM, and I am setting up a connection between MSSQL and BigQuery. When I try the normalized tabular data transformation, I get the following error: "Failed to decode invalid base64 string,externalMessage=Normalization failed during the dbt run. This may indicate a problem with the data itself." The field's original type is binary(8). 1. Is there any way to avoid loading this column, or to tell Airbyte to load it as a string from the origin, whatever it contains? 2. Can the problem be avoided by doing a custom transformation with dbt? Thanks in advance
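    On question 2: likely yes, since a custom dbt transformation replaces the generated normalization and can read the raw table directly, skipping or casting the offending column. A hedged BigQuery sketch (dataset, table, and column names are placeholders; the _airbyte_raw_* table and its _airbyte_data JSON string column are what Airbyte writes before normalization):
    Copy code
    -- models/my_table_no_binary.sql: unpack only the columns you want
    select
        _airbyte_ab_id,
        _airbyte_emitted_at,
        json_extract_scalar(_airbyte_data, '$.some_text_column') as some_text_column
        -- the binary(8) column is simply never selected
    from {{ source('my_dataset', '_airbyte_raw_my_table') }}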
  • a

    Arman S

    02/23/2023, 2:21 PM
    Hi guys, I am new to Airbyte and have been trying to install it in my local K8s cluster for testing. Here are the steps that I have followed: • Step 1: Install Kubernetes and Helm and add the Airbyte Helm repository:
    Copy code
    helm repo add airbyte https://airbyte.github.io/airbyte
    helm repo update
    • Step 2: Create a namespace for Airbyte:
    Copy code
    kubectl create namespace airbyte
    • Step 3: Install Airbyte using the Helm chart
    Copy code
    helm install airbyte airbyte/airbyte -n airbyte -f ./helm/values.yaml
    • Step 4: Forward ports
    Copy code
    kubectl --namespace airbyte port-forward service/airbyte-airbyte-webapp-svc 8000:80
    My values.yaml file looks like this:
    Copy code
    airbyte:
      deployment:
        replicas: 1
      service:
        type: ClusterIP
      ingress:
        enabled: false
      persistence:
        enabled: true
        size: 200MB
      image:
        tag: 0.31.3
        pullPolicy: IfNotPresent
      worker:
        enabled: true
        command: ["/bin/bash", "-c"]
        args:
          - |
            set -ex
            /airbyte/integration/bin/entrypoint.sh
        resources:
          requests:
            cpu: 500m
            memory: 512Mi
          limits:
            cpu: 1
            memory: 1Gi
    However, I am struggling to understand why the Worker pod doesn't spin up. The logs say the following:
    Stream closed: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"container \"airbyte-worker-container\" in pod \"airbyte-worker-767db5b6c7-k78cg\" is waiting to start: CreateContainerConfigError","reason":"BadRequest","code":400}
    Is this something that anyone could help me with? I was also wondering if I need the cron and temporal deployments?
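    A hedged debugging note: CreateContainerConfigError usually means the container references a ConfigMap or Secret key that doesn't exist, which heavy values.yaml overrides (like the custom worker command/args above) can easily cause. Describing the pod names the missing key. And to the last question: temporal is required (workers schedule jobs through it), and cron is what triggers scheduled syncs:
    Copy code
    kubectl -n airbyte describe pod airbyte-worker-767db5b6c7-k78cg   # Events section names the missing ConfigMap/Secret key
    kubectl -n airbyte get configmaps,secrets                        # compare against what the pod spec references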
  • c

    Claudio Cavallo

    02/23/2023, 2:38 PM
    Good morning! I'm trying to apply everything: the connections, sources, and destinations. When I run octavia apply, the sources and destinations work, but the connection between them fails:
    Copy code
    airbyte_api_client.exceptions.ApiTypeError: Invalid type for variable 'non_breaking_changes_preference'. Required value type is NonBreakingChangesPreference and passed type was str at ['non_breaking_changes_preference']
  • c

    Claudio Cavallo

    02/23/2023, 2:39 PM
    I'm using octavia-cli 0.40.32 and Airbyte 0.40.32
  • k

    Konstantin Lackner

    02/23/2023, 4:25 PM
    Good morning, I'm having an issue with the GA4 connector. The schemas for all streams are landing in BigQuery; however, I'm only getting data for one stream,
    daily_active_users
    . All other tables are empty. Can anyone help?
  • l

    Lior Shkiller

    02/23/2023, 6:07 PM
    Hey, we are getting a repeated exception for HubSpot:
    i.a.v.j.JsonSchemaValidator(test):130 - JSON schema validation failed.
    errors: $: null found, string expected
    What could the reason be? HubSpot version: 0.3.2, Airbyte version: 0.40.23
  • j

    José Lúcio Zancan Júnior

    02/23/2023, 6:52 PM
    Hey everyone. Can I ask a question about a specific connector (TikTok Marketing) here or is it better to ask in the repository/connector issues?
  • j

    José Lúcio Zancan Júnior

    02/23/2023, 7:49 PM
    I have a connection using TikTok Marketing as a source and BigQuery as a destination. I'm pulling the
    ads_reports_daily
    stream, which according to the connector docs:
    For example, if you select the daily-aggregation flavor of a report, the report will contain a row for each day for the duration of the report. Each row will indicate the number of impressions recorded on that day.
    My issue is that in the destination dataset I'm only seeing data for one day per ad. (If I ran a sync yesterday, I can only see yesterday's clicks/impressions/conversions. If I run a sync again today, yesterday's data is deleted and I start seeing only today's values.) I suspect this behavior is caused by the sync mode, which is set to "Incremental | Deduped + history", with
    stat_time_day
    as cursor and
    ad_id
    as primary key. But: 1. Isn't
    stat_time_day
    supposed to be part of the primary key too, so the transformation can keep one entry per day AND ad_id? 2. If I change the sync mode to "Incremental | Append", my problem is solved (kind of), but then I open the door to duplicates and missing data (days skipped due to errors or unavailability). 3. My guess is that "Incremental | *Deduped + history*" should work for what I need, but then I would need to set
    stat_time_day
    as part of the primary key as well, which seems impossible, since this field cannot be changed in the connector replication settings. Any tips, or is anyone in the same situation? Thanks in advance.
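    For intuition, "Deduped + history" keeps one row in the final table per declared primary key, with the latest cursor value winning; with ad_id alone as the key, each sync's newest day therefore replaces the whole history. A hedged BigQuery sketch of the semantics (table name is a placeholder):
    Copy code
    -- one surviving row per ad_id, newest stat_time_day wins
    select * from my_dataset.ads_reports_daily_raw
    where true  -- BigQuery requires WHERE/GROUP BY/HAVING alongside QUALIFY
    qualify row_number() over (
      partition by ad_id               -- a composite key (ad_id, stat_time_day) would keep one row per day instead
      order by stat_time_day desc
    ) = 1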
  • a

    An R

    02/23/2023, 7:54 PM
    Hi, we're deploying Airbyte in K8s using the stable overlay for now (we'll use Helm in the future). When applying the stable config via
    kubectl apply -k kube/overlays/stable
    eventually a worker pod fails with a message similar to
    ERROR i.m.r.Micronaut(handleStartupException):338 - Error starting Micronaut server: null
    . This error is preceded by a warning:
    Copy code
    i.a.m.l.MetricClientFactory(initialize):74 - MetricClient was not recognized or not provided. Accepted values are `datadog` or `otel`.
    Is there some way to disable METRIC_CLIENT? We've tried different values for METRIC_CLIENT (
    none
    ,
    false
    , and empty), but the pod still ends up failing. We don't use Datadog or OTEL, and leaving the value empty for METRIC_CLIENT doesn't seem to be a solution. What could we do? Thanks!
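    A hedged observation: the MetricClientFactory warning on its own should be benign (an unset METRIC_CLIENT just means no metrics are emitted), so the Micronaut null failure likely has a different root cause worth digging out of the full stack trace. If you want the variables pinned down explicitly anyway, a kustomize patch in your overlay could look like this (airbyte-env is the ConfigMap the stable overlay generates from .env):
    Copy code
    # kustomization.yaml fragment -- merge metric settings into airbyte-env
    configMapGenerator:
      - name: airbyte-env
        behavior: merge
        literals:
          - METRIC_CLIENT=
          - PUBLISH_METRICS=false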
  • c

    charlie song

    02/23/2023, 8:49 PM
    Hi, I am trying to use Airbyte Open-Source, following the instructions from https://docs.airbyte.com/quickstart/deploy-airbyte/. After I run "run-ab-platform.sh", a small window pops up, quickly appears, then disappears, and nothing happens; going to http://localhost:8000 gives "this site can't be reached". Would really appreciate your help.
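    A hedged troubleshooting sketch: the vanishing window suggests the script was double-clicked and exited immediately; running it from a terminal keeps the error visible, and Docker Desktop not running is the usual cause. Container names below are the standard compose ones:
    Copy code
    ./run-ab-platform.sh          # run from a shell (e.g. Git Bash on Windows) so output stays visible
    docker ps                     # did the airbyte-* containers come up?
    docker logs airbyte-server    # if they did but :8000 is unreachable, check the server logs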
  • s

    Shailendra Jaiswar

    02/23/2023, 9:54 PM
    Hi all 👋, I am new to Airbyte. I have deployed Airbyte Open-Source v0.43.22 in a GKE env v1.21.14-gke.7100 using the Helm chart. I learned there is an existing issue with deployments that have an Istio sidecar, ref: https://discuss.airbyte.io/t/airbyte-issues-with-istio-sidecar/383 So I deployed all modules successfully by removing the Istio sidecar container from each pod (it was created by default due to policies in our K8s cluster). However, when I create a new source connection through the UI, a pod is created with a default Istio sidecar container. That default container prevents the worker module from connecting to the pod, with the log error: Using existing AIRBYTE_ENTRYPOINT: /airbyte/base.sh Waiting on CHILD_PID 7 PARENT_PID: 1 Heartbeat to worker failed, exiting… received ABRT The default container can be removed by applying annotations, but I can't identify the job/template from which this pod is created. So I would like to know: • Which module creates a connection pod? • Where is the pod template defined in the Helm chart for the connections? Thanks
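    A hedged answer sketch: connector and job pods are launched programmatically by the worker, not from a chart template, so pod annotations are normally injected through the worker's JOB_KUBE_ANNOTATIONS environment variable (the exact values-file key may differ between chart versions):
    Copy code
    # values.yaml fragment -- annotate job pods so Istio skips sidecar injection
    worker:
      extraEnv:
        - name: JOB_KUBE_ANNOTATIONS
          value: "sidecar.istio.io/inject=false"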
  • m

    Malik

    02/23/2023, 11:21 PM
    There is a broken link on this page: https://docs.airbyte.com/deploying-airbyte/on-kubernetes-via-helm/ (Deploy Airbyte > Custom Deployment > charts/airbyte). Could you please point me to the correct values.yaml file?
  • a

    Arik Elbag

    02/23/2023, 11:45 PM
    Having trouble setting up QuickBooks; I followed all the steps and this is the error I'm getting:
    Copy code
    Last attempt:
    NaN Bytes | no records | no records | 2m 28s | Sync
    Failure Origin: normalization, Message: Something went wrong during normalization
    4:03PM 02/22
    3 attempts
    
    2023-02-23 00:13:46 - Additional Failure Information: message='io.temporal.serviceclient.CheckedExceptionWrapper: java.util.concurrent.ExecutionException: io.airbyte.workers.exception.WorkerException: Normalization Failed.', type='java.lang.RuntimeException', nonRetryable=false