# troubleshooting

    Thomas Ebermann

    03/21/2022, 10:48 AM
@Raj where do I swap out this patched fb library in the docker image?

    André Santos

    03/21/2022, 12:22 PM
Is this your first time deploying Airbyte: Yes
    OS Version / Instance: Mac OS 12.3
    Memory / Disk: 32Gb / 1Tb SSD
    CPU: Apple M1 Pro
    Deployment: Docker
    Airbyte Version: 0.35.58-alpha
    Step: Setting a new connection
    Description: When I'm trying to create a new connection for MySQL it returns the following error:
    ```
    Caused by: java.io.IOException: Cannot run program "/tmp/scripts2823960350151157342/image_exists.sh": error=0, Failed to exec spawn helper: pid: 180, exit value: 1
    ```
    It seems to be an issue related to the M1 Pro; is there any workaround or fix for it?

    user

    03/21/2022, 2:54 PM
Ask for help submission from @Jeff Crooks
    Is this your first time deploying Airbyte: No
    OS Version / Instance: AWS EC2 m.Large
    Memory / Disk: 80GB
    Deployment: Docker
    Airbyte Version: 0.35.43-alpha
    Source name/version: Stripe 0.1.29
    Destination name/version: Redshift 0.3.28
    Step: Full Refresh
    Description: The Stripe source is not loading all committed records to the target table. The logs indicate the subscriptions stream committed 100k+ records, but only 33k were loaded to the database: streamName=subscriptions,stats=io.airbyte.config.SyncStats@5d52391c[recordsEmitted=105164,bytesEmitted=297458664,stateMessagesEmitted=<null>,recordsCommitted=105164]]. Querying the raw tables for the same period, 33,576 records were loaded.

    Sergi van den Berg

    03/21/2022, 3:22 PM
Is this your first time deploying Airbyte: No
    OS Version / Instance: Microsoft Windows 11
    Memory / Disk: 16GB / 500GB
    Deployment: Docker (via Docker Desktop WSL2)
    Airbyte Version: 0.35.57
    Source name: Pipedrive
    Destination: Redshift
    Description: Hi guys, I am using the Get a connection API but it returns a 404. I am making a POST request in Postman using the connectionId in the connections URL. Does somebody have prior experience calling this API who can help me out?
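    A minimal sketch of what this lookup usually looks like, assuming a local deployment on port 8000 and a placeholder connection ID: the OSS config API endpoints are POST calls that take the connectionId in the JSON body, and a path-style URL like `/connections/<id>` is a common reason for a 404.
    ```python
    import requests

    # Assumptions: Airbyte runs locally on port 8000; the connection ID is a placeholder.
    AIRBYTE_URL = "http://localhost:8000/api/v1"
    connection_id = "00000000-0000-0000-0000-000000000000"

    # The connectionId goes in the POST body, not in the URL path.
    resp = requests.post(
        f"{AIRBYTE_URL}/connections/get",
        json={"connectionId": connection_id},
        headers={"Content-Type": "application/json"},
    )
    resp.raise_for_status()
    print(resp.json())
    ```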

    Jason Wiener

    03/21/2022, 5:48 PM
Good day. Thanks for this community to troubleshoot with. I have found a lot of useful information in other people's posts but nothing on this issue that I am encountering. I'd appreciate any insight into the scheduling that's going awry.
    Is this your first time deploying Airbyte: No
    Deployment: Kubernetes
    Airbyte Version: 0.35.42-alpha
    Source: Occurs with Airbyte Facebook Marketing and a custom Python HTTP connector
    Destination: Redshift via S3
    Source name/version: airbyte/source-facebook-marketing 0.2.37 / custom HTTP connector
    Destination name/version: airbyte/destination-redshift 0.3.23
    Description: Configured connections execute as designed when initiated manually but do not execute according to the Replication Frequency specified on the connector. For instance, a custom HTTP connector ran successfully at 13:21 on 18 March with Frequency = 8 hours but has not executed in the interim. This is also occurring with an Airbyte Facebook Marketing source and the same destination.

    Rajnish Malik

    03/21/2022, 6:24 PM
    Hi Daya/Dharmendra,

    Thomas Ebermann

    03/21/2022, 7:45 PM
Hey guys, I wanted to make the facebook-marketing connector work with API v13, so I changed the connector to pull the facebook_marketing lib from a forked lib, and I have now built the whole Airbyte docker image again with Gradle. How do I now deploy it not just on my local dev box but on my prod box? This is probably more of a small Docker question than anything else, e.g. I am using the official https://github.com/airbytehq/airbyte/blob/master/docker-compose.yaml to start it on prod; I guess I need to register my local image somewhere and then have my own docker-compose manifest that pulls it from there? (edited)

    Jordan Velich

    03/21/2022, 10:48 PM
Is this your first time deploying Airbyte: Yes
    OS Version / Instance: Ubuntu 20.04
    Memory / Disk: 64Gb / 2TB HDD
    Deployment: Docker
    Airbyte Version: 0.35.18-alpha
    Source name/version: Custom module
    Destination name/version: Custom module (Postgres)
    Description: The jobs for a particular connection keep failing with logs as attached. It always mentions that the state has been captured, but the `state` table is not updated for this connection. Can you please help me understand why this may be the case, and why the sync job is failing in the first place? From the logs it appears that it is cancelled; however, we do not cancel this job.

    Rakhi Modi

    03/22/2022, 5:20 AM
@Javier Llorente Mañas I am also trying to enable GCS logging and remove MinIO altogether. I have the same configuration as yours, but my scheduler is crashing and the logs say the following. What am I missing here?
    ```
    2022-03-22 05:10:51 INFO i.a.c.EnvConfigs(getEnvOrDefault):517 - {} - Using default value for environment variable S3_LOG_BUCKET: ''
    2022-03-22 05:10:51 INFO i.a.c.EnvConfigs(getEnvOrDefault):517 - {} - Using default value for environment variable S3_LOG_BUCKET_REGION: ''
    2022-03-22 05:10:51 INFO i.a.c.EnvConfigs(getEnvOrDefault):517 - {} - Using default value for environment variable AWS_ACCESS_KEY_ID: ''
    2022-03-22 05:10:51 INFO i.a.c.EnvConfigs(getEnvOrDefault):517 - {} - Using default value for environment variable AWS_SECRET_ACCESS_KEY: ''
    2022-03-22 05:10:51 INFO i.a.c.EnvConfigs(getEnvOrDefault):517 - {} - Using default value for environment variable S3_MINIO_ENDPOINT: ''
    2022-03-22 05:10:51 INFO i.a.c.EnvConfigs(getEnvOrDefault):517 - {} - Using default value for environment variable GCP_STORAGE_BUCKET: ''
    Exception in thread "main" java.lang.RuntimeException: Error no cloud credentials configured..
            at io.airbyte.config.helpers.CloudLogs.createCloudLogClient(CloudLogs.java:62)
            at io.airbyte.config.helpers.LogClientSingleton.createCloudClientIfNull(LogClientSingleton.java:161)
            at io.airbyte.config.helpers.LogClientSingleton.setWorkspaceMdc(LogClientSingleton.java:148)
            at io.airbyte.scheduler.app.SchedulerApp.main(SchedulerApp.java:198)
    ```

    ahsen m

    03/22/2022, 6:40 AM
How can I read the MinIO logs? I'm on k8s using the Helm chart.
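    One way to peek at what MinIO is actually storing is to point any S3 client at the MinIO service endpoint. A minimal sketch with boto3, assuming the endpoint is reachable (e.g. via a port-forward) and that the bucket name and the minio/minio123 credentials from the Helm values later in this thread apply; all of these may differ in your install.
    ```python
    import boto3

    # Assumptions: `kubectl port-forward svc/minio 9000:9000` (or similar) is running,
    # and the bucket/credentials match what the Helm chart was configured with.
    s3 = boto3.client(
        "s3",
        endpoint_url="http://localhost:9000",
        aws_access_key_id="minio",
        aws_secret_access_key="minio123",
    )

    # List the log objects Airbyte has written so far.
    for obj in s3.list_objects_v2(Bucket="airbyte-dev-logs").get("Contents", []):
        print(obj["Key"], obj["Size"])
    ```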

    ahsen m

    03/22/2022, 6:43 AM
I keep getting this error even though I've disabled logs.

    Nirmit Jain

    03/22/2022, 7:00 AM
#troubleshooting I am facing an issue while deploying Airbyte using Kubernetes: the Temporal service is unable to create the default namespace. Can anybody let me know the possible issue?

    Nirmit Jain

    03/22/2022, 7:01 AM
i.e. something like config, maybe some sort of external service?

    Sergi van den Berg

    03/22/2022, 9:10 AM
Is this your first time deploying Airbyte: No
    OS Version / Instance: Microsoft Windows 11
    Memory / Disk: 16GB / 500GB
    Deployment: Docker (via Docker Desktop WSL2)
    Airbyte Version: 0.35.57
    Source name: Pipedrive
    Destination: Redshift
    Description: Hi guys, I want to update the type of a field in my Pipedrive connection using the update API. I would like to change deal_id and lead_id from integer to string. I have no prior experience using this API, and I don't know whether I should pass the entire connection object in the body or just the part I would like to update. Does somebody know what you should pass in the API's body? Unfortunately I keep getting invalid JSON errors. Thanks!
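    A hedged sketch of the usual pattern: the OSS config API generally expects the whole connection object on update rather than a partial patch, so a common approach is to fetch the connection, modify the returned object, and send it all back. The port, connection ID, and exact field list below are assumptions, not a definitive payload.
    ```python
    import requests

    # Assumptions: local Airbyte on port 8000; placeholder connection ID.
    API = "http://localhost:8000/api/v1"
    connection_id = "00000000-0000-0000-0000-000000000000"

    # 1. Fetch the current connection definition.
    conn = requests.post(f"{API}/connections/get",
                         json={"connectionId": connection_id}).json()

    # 2. Modify what you need inside the returned object (e.g. the syncCatalog).
    catalog = conn["syncCatalog"]

    # 3. Send the whole object back; a partial body tends to fail validation.
    update_body = {
        "connectionId": connection_id,
        "syncCatalog": catalog,
        "status": conn["status"],
        "schedule": conn.get("schedule"),
        "namespaceDefinition": conn.get("namespaceDefinition"),
        "namespaceFormat": conn.get("namespaceFormat"),
        "prefix": conn.get("prefix"),
        "operationIds": conn.get("operationIds", []),
    }
    resp = requests.post(f"{API}/connections/update", json=update_body)
    resp.raise_for_status()
    print(resp.json()["status"])
    ```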

    Sergi van den Berg

    03/22/2022, 9:20 AM
@Marcos Marx (Airbyte) Do you have more extensive documentation on how to use the update API? It is still a bit unclear to me. Thanks a lot!

    Sophie Lohezic

    03/22/2022, 9:26 AM
Log with 0 records (0.4.9): "Skipping stream public.xxxxx because it is not in the source"

    raju

    03/22/2022, 1:23 PM
Hi Dev Team / Airbyte users, is there a way we can filter on a source column? We have a use case where we have a huge source table and we want to minimize the data transfer.

    Augustin Lafanechere (Airbyte)

    03/22/2022, 1:58 PM
    Hey Guillaume, glad you were able to solve your env problem! Thank you for the awesome debugging on your issue. I commented on it to answer your question.

    luisa zuluaga

    03/22/2022, 2:58 PM
Hi guys, I have to load information into Snowflake tables that already exist. We use sequences and other fields, so we can't just recreate the table. Is it possible to add information to an existing table?

    ahsen m

    03/22/2022, 3:26 PM
Hello team, is my config for MinIO correct? I'm unable to see logs in MinIO and it keeps saying:
    Internal Server Error: The AWS Access Key Id you provided does not exist in our records. (Service: S3, Status
    ```yaml
    logs:
      accessKey:
        password: <redacted> AWS Key
        existingSecret: ""
        existingSecretKey: ""
      secretKey:
        password: <redacted> AWS SEcret
        existingSecret: ""
        existingSecretKey: ""
      minio:
        enabled: false
      externalMinio:
        enabled: true
        host: minio.default.svc.cluster.local
        port: 9000
      s3:
        enabled: true
        bucket: reda3231-airbyte-dev-logs
        bucketRegion: "us-east-1"
    minio:
      MINIO_ROOT_USER: minio
      MINIO_ROOT_PASSWORD: minio123
      extraEnv:
        - name: MINNIO_LOG_LEVEL
          value: DEBUG
    ```

    ahsen m

    03/22/2022, 3:39 PM
Is there any local path on k8s where I can read the MinIO log?

    Rishika

    03/22/2022, 4:04 PM
Hello team, we have written a custom connector for getting data from a REST API. We are getting an "airbytecdk not found" error while testing the connector. Who can help us?
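    A small sanity check, on the assumption that the error is a plain Python import failure for the airbyte-cdk package in the connector's environment; running something like this in the same environment (or inside the connector container) usually shows whether the dependency is actually installed.
    ```python
    import sys

    try:
        # Both of these come from the `airbyte-cdk` PyPI package; a CDK-based
        # connector typically imports them in its source.py / main.py.
        from airbyte_cdk.sources import AbstractSource
        from airbyte_cdk.entrypoint import launch
    except ImportError as err:
        # If this fires, airbyte-cdk is missing: add "airbyte-cdk" to the
        # connector's setup.py install_requires (or requirements.txt) and
        # rebuild the connector image.
        sys.exit(f"airbyte-cdk not importable: {err}")

    print("airbyte-cdk imports resolve:", AbstractSource.__name__, launch.__name__)
    ```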

    ahsen m

    03/22/2022, 4:35 PM
So I am unable to see further what went wrong with the connection to Postgres; even the log file mentioned has the same output as in the picture. Is there no way to know if the password is wrong or what else is wrong?
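    One way to take Airbyte out of the picture, sketched under assumptions (host, port, database, and credentials below are placeholders): test the same Postgres credentials directly from the machine running Airbyte, e.g. with psycopg2, so an authentication or network error surfaces with its real message.
    ```python
    import psycopg2

    # Placeholder connection details; use the exact values configured in the
    # Airbyte source so the test exercises the same credentials and network path.
    try:
        conn = psycopg2.connect(
            host="my-postgres-host",
            port=5432,
            dbname="mydb",
            user="airbyte",
            password="...",
            connect_timeout=10,
        )
    except psycopg2.OperationalError as err:
        # A wrong password, an unreachable host, and a missing pg_hba.conf rule
        # each produce a distinct error message here.
        print("connection failed:", err)
    else:
        print("connection OK, server version:", conn.server_version)
        conn.close()
    ```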

    ahsen m

    03/22/2022, 5:45 PM
Caused by: io.temporal.failure.ApplicationFailure: message='/root/.kube/config (No such file or directory)', type='java.io.FileNotFoundException', nonRetryable=false

    Laurentiu Soica

    03/22/2022, 6:38 PM
Hello everyone, I did a quick integration for a REST API source and hit this: https://github.com/airbytehq/airbyte/issues/11276. Most probably I'm doing something wrong or there are some concepts I've misunderstood, but I can't manage to get the REST API source to sync automatically.

    Saya

    03/22/2022, 6:55 PM
Is this your first time deploying Airbyte: Yes
    OS Version / Instance: Amazon Linux 2 / AWS EC2 t2.large
    Deployment: Docker
    Airbyte Version: 0.35.56-alpha
    Source/version: Marketo
    Description: Having issues with setting up the Marketo connection. I followed all the steps outlined in the Marketo Setup Guide but am getting an error:
    `Exception("Error while refreshing access token: 'access_token'")`
    There is no field for an access token; was this supposed to be entered anywhere?
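    For context on where that token comes from: the connector exchanges the client ID/secret for an access token itself, so there is no field to paste one into. A minimal sketch of the exchange Marketo performs, with a placeholder base URL and credentials (assumptions, not the connector's exact code); if this call does not return an access_token, the base URL or credentials are usually the problem.
    ```python
    import requests

    # Placeholders: use your Marketo REST base URL (from Admin > Web Services)
    # and the client ID/secret of the API-enabled custom service.
    BASE_URL = "https://123-ABC-456.mktorest.com"
    CLIENT_ID = "your-client-id"
    CLIENT_SECRET = "your-client-secret"

    # Marketo issues short-lived access tokens from its identity endpoint.
    resp = requests.get(
        f"{BASE_URL}/identity/oauth/token",
        params={
            "grant_type": "client_credentials",
            "client_id": CLIENT_ID,
            "client_secret": CLIENT_SECRET,
        },
    )
    resp.raise_for_status()
    print(resp.json().get("access_token"))
    ```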

    Jordan Scott

    03/22/2022, 7:29 PM
I'm not sure if it's related, but I can't get the BigQuery Denormalized destination to work from the Instagram source either; mine says `sync worker failed.`:

    Emily Cogsdill

    03/22/2022, 8:22 PM
• Airbyte version: 0.35.27-alpha
    • OS Version / Instance: Ubuntu VM / GCP n1-standard-2 (2 vCPUs, 7.5 GB memory)
    • Deployment: Docker
    • Source Connector and version: S3 0.1.10
    • Destination Connector and version: BigQuery 0.6.7
    • Severity: Medium
    • Step where error happened: Loading data from source
    Hi team! I am trying to run a sync from an S3 source to BigQuery. The data actually seem to be getting delivered successfully to the destination, but I am still seeing an error in the logs, and the app is marking the sync as “Failed” and always retries twice before finally cancelling. The error I am seeing is:
    ```
    pyarrow.lib.ArrowInvalid: CSV parser got out of sync with chunker
    ```
    Does anyone have thoughts on what might be going on here & how to troubleshoot? I am running several other similar S3->BigQuery syncs from the same bucket (but with different file paths) without issue - only this one is throwing an error.

    Ile Lee

    03/22/2022, 11:20 PM
Hi everyone. I have a ticket where tables in Snowflake are all UPPERCASE except for one table. I haven't had an update yet; could anyone help me? https://github.com/airbytehq/airbyte/issues/10572