# troubleshooting
  • p

    Parth Gangrade

    03/31/2022, 1:14 PM
Is this your first time deploying Airbyte: Yes OS Version / Instance: macOS Monterey; instance built through GCP, e2-medium running Debian 10 Memory / Disk: 2 vCPUs, 4GB memory Deployment: Docker Airbyte Version: latest, or whichever is used for deploying to GCP in the instructions on https://docs.airbyte.com/deploying-airbyte/on-gcp-compute-engine Source name/version: not there yet Destination name/version: not there yet Step: Getting started Description: Having trouble reaching localhost:8000 after installing gcloud, docker, docker-compose, and Airbyte on the instance. Every browser I try throws the same "connection refused" or "refused to connect" error.
  • w

    Will Massey

    03/31/2022, 1:32 PM
    Is this your first time deploying Airbyte: No OS Version / Instance: GCP, e2-medium running Debian-10 Memory / Disk: - Deployment: Docker Airbyte Version: 0.35.31-alpha Source name/version: Hubspot 1.51 (I have also tried other earlier versions: 0.1.7 & 0.1.33) Destination name/version: BigQuery 0.6.8 Step: When syncing deal_pipelines Description: When syncing deal_pipelines from Hubspot, Airbyte only sends the final record to the destination even though I know there are more records in deal_pipeline. This is definitely an issue with the source connection as the logs report
    Total records read: 1
    . I have also tried manually connecting to the API endpoint (https://api.hubapi.com/crm-pipelines/v1/pipelines/deals) using the requests library in Python and the response includes all the details. I’m not sure what is going on and what the issue is, but perhaps it is a problem with how the schema is generated. Any help would be much appreciated!
  • m

    Marcos Marx (Airbyte)

    03/31/2022, 3:00 PM
    If you encounter any issues using Airbyte, check out our Troubleshooting forum. You’ll see how others have got their issues resolved, and our team will be there to assist if your issue hasn’t been encountered yet.
  • c

    Colton Pomeroy

    03/31/2022, 9:14 PM
We're having trouble pulling in data via the Google Ads connector. We see data on our end, but the connector shows no records. The account is a test account. Are test accounts supported?
  • w

    Will Moore

    03/31/2022, 9:45 PM
    Hello! Looking for help getting the WooCommerce plugin working with our hosted version, unsure how to add it so it can appear in our Airbyte console.
  • w

    Will Moore

    03/31/2022, 10:50 PM
Okay, now it's failing with a 403, but all the credentials are right; verified with Postman.
  • e

    Emily Cogsdill

    03/31/2022, 10:54 PM
    not sure if this is the best channel for this, but: Has anyone here successfully implemented Sentry to monitor errors from Airbyte? I followed the instructions here and changed the SENTRY_DSN value in the .env file to match the one listed with my Sentry project, but sync failures don’t seem to emit events to Sentry. Is there some configuration step I am missing, either within Airbyte or Sentry?
  • t

    Thiago Costa

    04/01/2022, 3:59 AM
    Hi, we recently updated our airbyte servers to the latest version and since then we've noticed that the S3 destination writing parquet files doesn't follow the naming convention from before the update. Seems like the new buffer file implementation changed something there
  • d

    Dejan Antonic

    04/01/2022, 9:21 AM
Hi team, we have a minio pod in an error status, and the logs say "no space left on device". Is there anything else we can do besides raising the PV claim?
  • t

    Trev Killick

    04/01/2022, 11:41 AM
Is this your first time deploying Airbyte: Yes OS Version / Instance: Mac OS Memory / Disk: 16Gb / 1Tb SSD Deployment: Docker Airbyte Version: 0.35.63-alpha Source name/version: HubSpot 0.1.51 Destination name/version: BigQuery 1.0.1 Step: On sync Description: Whilst using my HubSpot API key, which tested fine when setting up the source, I get the following error on sync:
    errors: $.client_id: is missing but it is required, $.client_secret: is missing but it is required, $.refresh_token: is missing but it is required, $.credentials_title: must be a constant value OAuth Credentials, $.credentials_title: does not have a value in the enumeration [OAuth Credentials]
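The validation errors above suggest the connector is checking the config against its OAuth credentials schema. As a hedged illustration only (the field names are taken straight from the error message; the overall shape and placeholder values are assumptions, and the exact config layout can differ between connector versions), an OAuth-style HubSpot source config would need to look roughly like:

```json
{
  "credentials": {
    "credentials_title": "OAuth Credentials",
    "client_id": "<your-client-id>",
    "client_secret": "<your-client-secret>",
    "refresh_token": "<your-refresh-token>"
  }
}
```

If you intend to authenticate with an API key instead, the error usually means the saved source config no longer matches the schema of the connector version in use, so re-creating or re-saving the source can help.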
  • s

    Sharal Pinto

    04/01/2022, 3:47 PM
Hi @Erica Struthers @Ari Bajo (FYI), I'm facing some issues when trying to sync Hubspot (source) to BigQuery (destination). I was able to set up the connection without any issues, but it fails while syncing. Here is a screenshot of the error message and logs. Please assist. Thanks!
  • w

    Will Moore

    04/01/2022, 4:09 PM
    Where is best place to get support for WooCommerce plugin?
  • z

    Zlatan Ivanov

    04/01/2022, 5:04 PM
Is this your first time deploying Airbyte: No Deployment: Docker Airbyte Version: 0.32.4-alpha Source name/version: Google Ads Destination name/version: BigQuery Step: Sync GAQL query. Description: Guys, I've tried to load several Google Ads queries through GAQL, but none of them load any rows of data. Here's my super basic query:
    SELECT segments.ad_destination_type FROM campaign
It still loads 0 rows of data. When I select other tables, they load thousands of rows. I'm not sure what the issue is. Could anyone give me some hints?
  • m

    Matheus Guinezi

    04/01/2022, 5:07 PM
Hi guys, how can I rebuild the Airbyte Docker images from scratch, removing all cache and previous builds?
  • j

    Jing Zhang

    04/01/2022, 6:29 PM
    👋 Hello, team! I'm trying to set up GSM through the environment variables
    SECRET_STORE_GCP_PROJECT_ID
    and
    SECRET_STORE_GCP_CREDENTIALS
    . How do I provide the credentials? Right now I'm pasting the whole json in the
    .env
    file
    SECRET_STORE_GCP_CREDENTIALS={"type": "service_account","project_id":"my_project","private_key_id": "acd90445....","private_key":"...",...}
    but then I got an error when setting up a Postgres source as shown in the screenshot. Is this related?
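One quick, hedged sanity check when pasting a service-account key into an .env file (assuming docker-compose passes the value through verbatim): make sure the value is still valid JSON on one line, since literal newlines inside private_key are a common way for it to break.

```python
import json

# Hypothetical one-line value as it might appear after
# "SECRET_STORE_GCP_CREDENTIALS=" in .env. Note the newlines inside
# private_key are the two characters backslash-n, not literal line breaks.
env_value = (
    '{"type": "service_account", "project_id": "my_project", '
    '"private_key": "-----BEGIN PRIVATE KEY-----\\nabc\\n-----END PRIVATE KEY-----\\n"}'
)

# Raises json.JSONDecodeError if the pasted value got mangled.
creds = json.loads(env_value)
print(creds["type"])  # service_account
```

If json.loads fails on the value exactly as it appears in your .env, the Postgres-source error could well be a downstream symptom of the secret store failing to initialize.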
  • e

    Eric Santulli

    04/01/2022, 6:40 PM
Hey everyone, I am working to set up a HubSpot source. I have figured out the setup through the API key and Public Apps; however, for our use case HubSpot's Private Apps seem to fit best. From my testing I believe this is not supported in the Airbyte connector, though. Could anyone confirm this?
  • k

    Kev Daly

    04/01/2022, 8:27 PM
    Hi all, I am seeing
  • m

    Marcos Marx (Airbyte)

    04/01/2022, 8:30 PM
    If you encounter any issues using Airbyte, check out our Troubleshooting forum. You’ll see how others have got their issues resolved, and our team will be there to assist if your issue hasn’t been encountered yet.
  • k

    Kev Daly

    04/01/2022, 8:36 PM
Is this your first time deploying Airbyte: No Deployment: Docker Airbyte Version: 0.35.3-alpha Source name/version: PostgreSQL / Redshift Destination name/version: Snowflake Step: Loading from temp tables Description: Hi all, I am experiencing this issue as I start to set up my connection to Snowflake via Airbyte. I was loading normally before, but decided to make changes to the destination structure (rename database, etc.) and updated the connection in Airbyte accordingly. I'm now experiencing this issue, which seems to be with the Snowflake connection, between when the staging tables load and when they try to load into the production tables. Testing the destination connection shows no issues. Any advice would be a huge help - thank you!
    2022-04-01 18:26:07 normalization >   File ".../transform.py", line 198, in transform_snowflake
    2022-04-01 18:26:07 normalization >     "password": config["password"],
    2022-04-01 18:26:07 normalization > KeyError: 'password'
    2022-04-01 18:26:07 INFO i.a.w.t.TemporalAttemptExecution(lambda$getWorkerThread$2):177 - Completing future exceptionally...
    io.airbyte.workers.WorkerException: Normalization Failed.
  • m

    Matheus Guinezi

    04/01/2022, 9:36 PM
Hi guys! I am trying to change the facebook_business Python library from v12 to v13 in the source-facebook-marketing setup.py installation, but it keeps giving the same error, even when I start from scratch and rebuild the containers and images. Is the way I am getting the lib from git correct? Is there any cache that could prevent me from getting this facebook_business source code?
  • b

    Bryan Estrada

    04/01/2022, 10:36 PM
    How can I change the HTTP port (instead of 8000, I want to use 8080)?
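With a Docker deployment, one common approach is to change only the host side of the webapp port mapping in docker-compose.yaml. The fragment below is illustrative: the service name and container-side port are assumptions and may differ between Airbyte versions, so check your own compose file for the actual values.

```yaml
# docker-compose.yaml (illustrative fragment; service name and
# container port are assumptions - verify against your compose file)
services:
  webapp:
    ports:
      - "8080:80"   # host port 8080 -> container port
```

After editing, restarting the stack (docker-compose down && docker-compose up) should make the UI reachable on port 8080.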
  • a

    Ashish Bansal

    04/02/2022, 2:47 AM
    Is this your first time deploying Airbyte: no Deployment: Kubernetes Source: MySQL (5.6) Destination: Clickhouse Description:
2022-04-02 02:27:49 INFO i.a.w.p.KubePodProcess(close):695 - (pod: analytics / source-mysql-sync-3-0-xgbwt) - Closed all resources for pod
2022-04-02 02:27:49 INFO i.a.w.DefaultReplicationWorker(lambda$getReplicationRunnable$5):305 - Total records read: 0 (0 bytes)
2022-04-02 02:27:50 INFO i.a.w.p.KubePodProcess(close):695 - (pod: analytics / destination-clickhouse-sync-3-0-uynsd) - Closed all resources for pod
2022-04-02 02:27:50 ERROR i.a.w.DefaultReplicationWorker(run):169 - Sync worker failed.
    java.util.concurrent.ExecutionException: io.airbyte.workers.DefaultReplicationWorker$SourceException: Source process exited with non-zero exit code 1
    	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:396) ~[?:?]
    	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:2073) ~[?:?]
    	at io.airbyte.workers.DefaultReplicationWorker.run(DefaultReplicationWorker.java:162) ~[io.airbyte-airbyte-workers-0.35.64-alpha.jar:?]
    	at io.airbyte.workers.DefaultReplicationWorker.run(DefaultReplicationWorker.java:57) ~[io.airbyte-airbyte-workers-0.35.64-alpha.jar:?]
    	at io.airbyte.workers.temporal.TemporalAttemptExecution.lambda$getWorkerThread$2(TemporalAttemptExecution.java:155) ~[io.airbyte-airbyte-workers-0.35.64-alpha.jar:?]
    	at java.lang.Thread.run(Thread.java:833) [?:?]
    	Suppressed: io.airbyte.workers.WorkerException: Source process exit with code 1. This warning is normal if the job was cancelled.
    		at io.airbyte.workers.protocols.airbyte.DefaultAirbyteSource.close(DefaultAirbyteSource.java:136) ~[io.airbyte-airbyte-workers-0.35.64-alpha.jar:?]
    		at io.airbyte.workers.DefaultReplicationWorker.run(DefaultReplicationWorker.java:126) ~[io.airbyte-airbyte-workers-0.35.64-alpha.jar:?]
    		at io.airbyte.workers.DefaultReplicationWorker.run(DefaultReplicationWorker.java:57) ~[io.airbyte-airbyte-workers-0.35.64-alpha.jar:?]
    		at io.airbyte.workers.temporal.TemporalAttemptExecution.lambda$getWorkerThread$2(TemporalAttemptExecution.java:155) ~[io.airbyte-airbyte-workers-0.35.64-alpha.jar:?]
    		at java.lang.Thread.run(Thread.java:833) [?:?]
    Caused by: io.airbyte.workers.DefaultReplicationWorker$SourceException: Source process exited with non-zero exit code 1
    	at io.airbyte.workers.DefaultReplicationWorker.lambda$getReplicationRunnable$5(DefaultReplicationWorker.java:312) ~[io.airbyte-airbyte-workers-0.35.64-alpha.jar:?]
    	at java.util.concurrent.CompletableFuture$AsyncRun.run(CompletableFuture.java:1804) ~[?:?]
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136) ~[?:?]
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) ~[?:?]
    	... 1 more
2022-04-02 02:27:50 INFO i.a.w.DefaultReplicationWorker(run):228 - sync summary: io.airbyte.config.ReplicationAttemptSummary@3f52e1b1[status=failed,recordsSynced=0,bytesSynced=0,startTime=1648866152849,endTime=1648866470892,totalStats=io.airbyte.config.SyncStats@64ef342d[recordsEmitted=0,bytesEmitted=0,stateMessagesEmitted=0,recordsCommitted=0],streamStats=[]]
2022-04-02 02:27:50 INFO i.a.w.DefaultReplicationWorker(run):250 - Source did not output any state messages
2022-04-02 02:27:50 WARN i.a.w.DefaultReplicationWorker(run):261 - State capture: No state retained.
2022-04-02 02:27:50 INFO i.a.w.t.TemporalAttemptExecution(get):131 - Stopping cancellation check scheduling...
    2022-04-02 02:27:50 [32mINFO[m i.a.w.t.s.ReplicationActivityImpl(lambda$replicate$1):147 - sync summary: io.airbyte.config.StandardSyncOutput@58afd5e4[standardSyncSummary=io.airbyte.config.StandardSyncSummary@2be1cca4[status=failed,recordsSynced=0,bytesSynced=0,startTime=1648866152849,endTime=1648866470892,totalStats=io.airbyte.config.SyncStats@64ef342d[recordsEmitted=0,bytesEmitted=0,stateMessagesEmitted=0,recordsCommitted=0],streamStats=[]],state=<null>,outputCatalog=io.airbyte.protocol.models.ConfiguredAirbyteCatalog@25052332[streams=[io.airbyte.protocol.models.ConfiguredAirbyteStream@28e51ac6[stream=io.airbyte.protocol.models.AirbyteStream@e43eacb[name=foo_analysis_codecount,jsonSchema={"type":"object","properties":{"id":{"type":"number"},"date":{"type":"string"},"count":{"type":"number"},"user_id":{"type":"number"},"_ab_cdc_log_pos":{"type":"number"},"_ab_cdc_log_file":{"type":"string"},"_ab_cdc_deleted_at":{"type":"string"},"_ab_cdc_updated_at":{"type":"string"}}},supportedSyncModes=[full_refresh, incremental],sourceDefinedCursor=true,defaultCursorField=[],sourceDefinedPrimaryKey=[[id]],namespace=foo,additionalProperties={}],syncMode=incremental,cursorField=[],destinationSyncMode=append_dedup,primaryKey=[[id]],additionalProperties={}], io.airbyte.protocol.models.ConfiguredAirbyteStream@2b62934b[stream=io.airbyte.protocol.models.AirbyteStream@27f239a6[name=foo_upc_upcsurveyresponse,jsonSchema={"type":"object","properties":{"id":{"type":"number"},"user_id":{"type":"number"},"response":{"type":"string"},"creation_date":{"type":"string"},"response_type":{"type":"number"},"updation_date":{"type":"string"},"_ab_cdc_log_pos":{"type":"number"},"_ab_cdc_log_file":{"type":"string"},"_ab_cdc_deleted_at":{"type":"string"},"_ab_cdc_updated_at":{"type":"string"}}},supportedSyncModes=[full_refresh, 
incremental],sourceDefinedCursor=true,defaultCursorField=[],sourceDefinedPrimaryKey=[[id]],namespace=foo,additionalProperties={}],syncMode=incremental,cursorField=[],destinationSyncMode=append_dedup,primaryKey=[[id]],additionalProperties={}]],additionalProperties={}],failures=[io.airbyte.config.FailureReason@27c134b4[failureOrigin=source,failureType=<null>,internalMessage=io.airbyte.workers.DefaultReplicationWorker$SourceException: Source process exited with non-zero exit code 1,externalMessage=Something went wrong within the source connector,metadata=io.airbyte.config.Metadata@36458735[additionalProperties={attemptNumber=0, jobId=3}],stacktrace=java.util.concurrent.CompletionException: io.airbyte.workers.DefaultReplicationWorker$SourceException: Source process exited with non-zero exit code 1
    	at java.base/java.util.concurrent.CompletableFuture.encodeThrowable(CompletableFuture.java:315)
    	at java.base/java.util.concurrent.CompletableFuture.completeThrowable(CompletableFuture.java:320)
    	at java.base/java.util.concurrent.CompletableFuture$AsyncRun.run(CompletableFuture.java:1807)
    	at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136)
    	at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635)
    	at java.base/java.lang.Thread.run(Thread.java:833)
    Caused by: io.airbyte.workers.DefaultReplicationWorker$SourceException: Source process exited with non-zero exit code 1
    	at io.airbyte.workers.DefaultReplicationWorker.lambda$getReplicationRunnable$5(DefaultReplicationWorker.java:312)
    	at java.base/java.util.concurrent.CompletableFuture$AsyncRun.run(CompletableFuture.java:1804)
    	... 3 more
    ,retryable=<null>,timestamp=1648866469670]]]
This means that the worker responsible for reading the source failed, is that correct? Is there a way I can look into its logs to see how it failed and try to debug it? I checked `source-mysql-sync`'s containers but they don't output any information.
  • k

    Kyle Cheung

    04/02/2022, 10:27 PM
Anyone else encountering a bug with Zendesk Support 0.2.3 (I just upgraded and had to downgrade back to 0.1.12) where the jobs just fail and the final message is $method? I can post logs shortly.
  • t

    TG

    04/03/2022, 11:45 AM
Finding it hard to fix this: I am trying to store Minio logs to GCS. 1. Created a SA 2. Assigned the storage.admin role 3. Created the bucket 4. Created a configmap to pass serviceaccount-secret.json, mounted as a volume on the pods 5. Updated the helm chart with the following helm-values:
    externalMinio:
        enabled: false
        host: localhost
        port: 9000
      s3:
        enabled: false
        #bucket: airbyte-dev-logs
        bucket: airbyte-logs-prod
        bucketRegion: ""
      gcs:
        bucket: "airbyte-logs-prod"
        credentials: "/secrets/gcs-log-creds/airbyte-logs-sa.json"
    Error
    ======= service endpoint: <http://airbyte-minio:9000>
    ======= this is new
    Registering Google Cloud Storage publish helper -> com.van.logging.gcp.CloudStorageConfiguration@e6054d9
    Collecting content into /tmp/toBePublished4193778332960186391.tmp before uploading.
    Collecting content into /tmp/toBePublished9123473825388703691.tmp before uploading.
    com.google.cloud.storage.StorageException: Anonymous caller does not have storage.buckets.get access to the Google Cloud Storage bucket.
    	at com.google.cloud.storage.spi.v1.HttpStorageRpc.translate(HttpStorageRpc.java:233)
    	at com.google.cloud.storage.spi.v1.HttpStorageRpc.get(HttpStorageRpc.java:425)
    	at com.google.cloud.storage.StorageImpl.lambda$get$4(StorageImpl.java:264)
    	at com.google.api.gax.retrying.DirectRetryingExecutor.submit(DirectRetryingExecutor.java:105)
    	at com.google.cloud.RetryHelper.run(RetryHelper.java:76)
    	at com.google.cloud.RetryHelper.runWithRetries(RetryHelper.java:50)
    	at com.google.cloud.storage.Retrying.run(Retrying.java:51)
    	at com.google.cloud.storage.StorageImpl.run(StorageImpl.java:1374)
    	at com.google.cloud.storage.StorageImpl.get(StorageImpl.java:263)
    	at com.van.logging.gcp.CloudStoragePublishHelper.publishFile(CloudStoragePublishHelper.java:38)
    	at com.van.logging.AbstractFilePublishHelper.end(AbstractFilePublishHelper.java:61)
    	at com.van.logging.BufferPublisher.endPublish(BufferPublisher.java:66)
    	at com.van.logging.LoggingEventCache.publishEventsFromFile(LoggingEventCache.java:190)
    	at com.van.logging.LoggingEventCache.lambda$publishCache$0(LoggingEventCache.java:232)
    	at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:539)
    	at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
    	at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136)
    	at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635)
    	at java.base/java.lang.Thread.run(Thread.java:833)
    Caused by: com.google.api.client.googleapis.json.GoogleJsonResponseException: 401 Unauthorized
    GET <https://storage.googleapis.com/storage/v1/b/airbyte-logs-prod?projection=full>
    {
      "code" : 401,
      "errors" : [ {
        "domain" : "global",
        "location" : "Authorization",
        "locationType" : "header",
        "message" : "Anonymous caller does not have storage.buckets.get access to the Google Cloud Storage bucket.",
        "reason" : "required"
      } ],
      "message" : "Anonymous caller does not have storage.buckets.get access to the Google Cloud Storage bucket."
    }
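The "Anonymous caller" error in the trace above means the log publisher is authenticating with no credentials at all, i.e. the mounted key file is not being picked up. A hedged guess at the missing piece (the exact values key depends on your chart version, so treat this fragment as an assumption to verify against the chart's own values.yaml): the pods typically also need GOOGLE_APPLICATION_CREDENTIALS pointing at the mounted key file.

```yaml
# Illustrative helm values fragment; the env-var injection mechanism
# varies by chart version - the path mirrors the gcs.credentials value above
extraEnv:
  - name: GOOGLE_APPLICATION_CREDENTIALS
    value: /secrets/gcs-log-creds/airbyte-logs-sa.json
```

If the GCS client finds that variable, it authenticates as the service account instead of as an anonymous caller, which is exactly what the 401 is complaining about.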
  • u

    윤도경

    04/03/2022, 11:57 AM
Is this your first time deploying Airbyte: yes OS Version / Instance: M1 MAC Deployment: Docker Airbyte Version: dev of 0.35.62-alpha Source: mongodb-v2 Destination: S3 Description: The destination connection is normal. However, when syncing with this connection, the attached error occurs and it does not work. Maybe it's some kind of JSON parse error. Could you give me any advice?
  • a

    Ayyoub Maulana Hadidy

    04/04/2022, 4:26 AM
Hi, can I know how Airbyte hashes the __airbyte__unique_key? Does Airbyte use this?
    hashlib.sha256(primary-key).hexdigest()
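For what it's worth, a hedged sketch of how a dbt-style surrogate key is typically built: Airbyte's normalization runs through dbt, whose surrogate_key macro takes an MD5 over the concatenated key columns rather than SHA-256. The '-' delimiter below mirrors dbt_utils but should be treated as an assumption, not a guarantee of Airbyte's exact output.

```python
import hashlib

def surrogate_key(*columns) -> str:
    """MD5 over the concatenated, cast-to-string key columns,
    in the style of dbt_utils.surrogate_key.
    The '-' delimiter is an assumption borrowed from dbt_utils."""
    joined = "-".join(str(c) for c in columns)
    return hashlib.md5(joined.encode("utf-8")).hexdigest()

print(surrogate_key(42, "us-east"))  # md5 of "42-us-east"
```

So a SHA-256 of the primary key, as in the snippet above, would not reproduce the values Airbyte generates.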
  • g

    Gujjalapati Raju

    04/04/2022, 4:54 AM
    Hi, Is that working now?
  • g

    Gujjalapati Raju

    04/04/2022, 6:10 AM
• adapt the dbt command to use the local git repo. How are you doing this?
  • a

    Alban van Rijsewijk

    04/04/2022, 7:22 AM
Hello everyone, I'm getting an issue and maybe some of you here can help: I'm trying to send my production data from Postgres to Snowflake, and the data is not sent to the right schema in the Snowflake database. Here is what's happening: 1. I followed the documentation, but data is populated by Airbyte in the "public" schema created by default when creating a new database on Snowflake, instead of the "AIRBYTE_SCHEMA" created for this purpose 2. This only happens for my production data; all other connectors work well and use the relevant schema. I've attached the setup of the connection and the resulting tables in "AIRBYTE_DATABASE". Does anyone have any idea what I've done wrong, please? Appreciate your help!
  • o

    Oluwapelumi Adeosun

    04/04/2022, 11:54 AM
    Is this your first time deploying Airbyte: Yes Deployment: Docker Airbyte Version: 0.35.45-alpha Source: Postgres (0.4.9) Destination: Google Cloud Storage (GCS) (0.1.24) Description: I have been syncing data from Postgres to GCS and it was working fine until 3 days ago. I have tried upgrading and downgrading my connectors. And also resetting and syncing again. But nothing has worked yet.