# ask-community-for-troubleshooting
  • Octavia Squidington III
    06/19/2023, 7:45 PM
    🔥 Community Office Hours starts in 15 minutes 🔥 Topic and schedule posted in #C045VK5AF54. At 1pm PDT, click here to join us on Zoom!
  • Matheus Barbosa
    06/19/2023, 9:13 PM
    Please, someone help me: our company is having a big problem trying to sync Google Ads data with ClickHouse, and errors about Nested columns having different array sizes keep showing up. How can we avoid that and successfully sync this data with ClickHouse?
    20:17:09.821209 [error] [MainThread]: Database Error in model AW_campaigns_scd (models/generated/airbyte_incremental/scd/metrito_airbyte/AW_campaigns_scd.sql)
    20:17:09.821596 [error] [MainThread]:   :HTTPDriver for <https://rtswk5h81k.us-east-2.aws.clickhouse.cloud:8443> returned response code 500)
    20:17:09.821870 [error] [MainThread]:    Code: 190. DB::Exception: Elements 'campaign.excluded_parent_asset_field_types' and 'campaign.targeting_s__g.target_restrictions' of Nested data structure 'campaign' (Array columns) have different array sizes. (SIZES_OF_ARRAYS_DONT_MATCH) (
    20:17:09.822422 [error] [MainThread]: Database Error in model AW_display_topics_performance_report (models/generated/airbyte_tables/metrito_airbyte/AW_display_topics_performance_report.sql)
    20:17:09.822815 [error] [MainThread]:   :HTTPDriver for <https://rtswk5h81k.us-east-2.aws.clickhouse.cloud:8443> returned response code 500)
    20:17:09.823084 [error] [MainThread]:    Code: 190. DB::Exception: Elements 'ad_group_criterion.final_urls' and 'ad_group_criterion.topic.path' of Nested data structure 'ad_group_criterion' (Array columns) have different array sizes. (SIZES_OF_ARRAYS_DONT_MATCH) (version 23.5.1.34
    20:17:09.823501 [error] [MainThread]:   compiled Code at ../build/run/airbyte_utils/models/generated/airbyte_tables/metrito_airbyte/AW_display_topics_performance_report.sql,retryable=<null>,timestamp=1686946637696,additionalProperties={}]],additionalProperties={}]
    These are part of our logs.
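For context on the SIZES_OF_ARRAYS_DONT_MATCH error above: ClickHouse requires every array column inside one Nested structure to have the same length in each row. A minimal Python sketch of a client-side workaround, padding the shorter arrays before insert (column names are illustrative, not the real Google Ads schema; ClickHouse's `flatten_nested` setting is another angle worth investigating):

```python
def pad_nested_row(row, nested_keys):
    """Pad sibling Nested arrays to a common length so ClickHouse accepts the row."""
    longest = max(len(row[key]) for key in nested_keys)
    padded = dict(row)
    for key in nested_keys:
        # None becomes NULL in a Nullable() nested column
        padded[key] = row[key] + [None] * (longest - len(row[key]))
    return padded

row = {"excluded_types": ["a", "b", "c"], "target_restrictions": ["x"]}
fixed = pad_nested_row(row, ["excluded_types", "target_restrictions"])
```

This only sketches the invariant the database enforces; whether padding or restructuring the schema is the right fix depends on how the Google Ads data is normalized downstream.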
  • nanggardev
    06/20/2023, 3:27 AM
    Hi, is the materialized view issue on Postgres still unsolved? I'm on Airbyte version 0.40.32, but materialized views don't show up.
  • Lenin Mishra
    06/20/2023, 6:42 AM
    Hello everyone! I keep running into issues reading the invoices endpoint from Zoho Books. I am building the connector with the Low-Code CDK. The setup is very simple and I am authenticating the endpoint with OAuth. All the necessary configs are provided (please refer to the attached image), yet I keep getting the error below:
    File "/root/.pyenv/versions/3.9.11/lib/python3.9/site-packages/airbyte_cdk/sources/streams/http/requests_native_auth/abstract_oauth.py", line 33, in get_auth_header
        return {"Authorization": f"Bearer {self.get_access_token()}"}
      File "/root/.pyenv/versions/3.9.11/lib/python3.9/site-packages/airbyte_cdk/sources/streams/http/requests_native_auth/abstract_oauth.py", line 38, in get_access_token
        token, expires_in = self.refresh_access_token()
      File "/root/.pyenv/versions/3.9.11/lib/python3.9/site-packages/airbyte_cdk/sources/declarative/auth/oauth.py", line 116, in refresh_access_token
        return response_json[self.get_access_token_name()], response_json[self.get_expires_in_name()]
    KeyError: 'access_token'
    Now, when I run the same request in Postman I get all my results. It seems something goes wrong when generating a new access token: for some reason, the token_refresh_endpoint is not returning a JSON with the access_token key. Can someone help me with this?
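One way to see what the refresh endpoint actually returned: OAuth token endpoints can answer 200 OK with an error body instead of a token, and the CDK then fails with a bare KeyError when it indexes access_token. A hedged sketch of a defensive check (the function and message are illustrative, not CDK code):

```python
def extract_token(response_json):
    """Fail with the full response body instead of a bare KeyError."""
    if "access_token" not in response_json:
        raise RuntimeError(f"token refresh failed, endpoint returned: {response_json}")
    return response_json["access_token"], response_json.get("expires_in")
```

Logging the raw refresh response this way usually reveals the cause, for example an expired refresh token or a region-specific accounts domain.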
  • Nikolai Nergård
    06/20/2023, 8:58 AM
    Hi! When attempting to upgrade a source connector in an Airbyte OSS Helm installation, I get a response with
    upstream request timeout
    with a 504 status. The endpoint failing is
    /api/v1/source_definitions/update
    . The weird thing is, if I wait about a minute and click again, it works. However, it fails the first time for every connector with the same response. I also get this response from the
    /api/v1/scheduler/sources/check_connection
    endpoint.
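Since the call succeeds on the second click, a client-side retry with backoff can paper over the cold-start timeout while the root cause is investigated. A generic sketch, not Airbyte API code (TimeoutError stands in for whatever exception your HTTP client raises on a 504):

```python
import time

def call_with_retry(fn, retries=3, base_delay=1.0):
    """Retry a flaky call with exponential backoff, re-raising after the last attempt."""
    for attempt in range(retries):
        try:
            return fn()
        except TimeoutError:
            if attempt == retries - 1:
                raise
            time.sleep(base_delay * 2 ** attempt)
```

This is a workaround only; a first-request-always-fails pattern usually points at something warming up server-side (e.g. a pod or connection pool) that is worth profiling.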
  • Sebastien HO
    06/20/2023, 10:15 AM
    Hi, not sure if this is the right place to post this error: I have worker pods that fail to init due to a timeout issue; the process is then retried and works. My concern is that the failing pod stays in 'InitError' state and is not terminated. Shouldn't it be deleted by the sweeper?
  • Hiroto Yamakawa
    06/20/2023, 11:04 AM
    Hi, has anyone ever experienced missing rows with the Salesforce connector? We have two environments (dev and production) on two distinct EC2 instances, with the same Salesforce parameters, and used a start date of 1971-01-01 to pull everything. A row appears and is ingested in dev, but not in production. The value doesn't appear in
    _AIRBYTE_ROW_TABLE
    itself. I've deleted and recreated the source and connection from scratch in production and reset everything. Still, I'm missing about 5 rows out of 55k, and I cannot tell why.
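One way to pin down exactly which rows are missing is to export the primary keys from both environments' raw tables and diff them. A minimal sketch (the key name is illustrative):

```python
def missing_keys(dev_rows, prod_rows, key="Id"):
    """Return key values present in dev but absent from production."""
    return sorted({r[key] for r in dev_rows} - {r[key] for r in prod_rows})
```

Once the specific Ids are known, inspecting those records in Salesforce (unusual field values, record types, permissions) often explains why one environment skipped them.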
  • Octavia Squidington III
    06/20/2023, 1:45 PM
    🔥 Office Hours starts in 15 minutes 🔥 Topic and schedule posted in #C045VK5AF54. At 16:00 CEST / 10am EDT, click here to join us on Zoom!
  • Bryce Arden
    06/20/2023, 2:30 PM
    Hey folks, I'm using LocalStack for local development, and I was curious if anyone has successfully connected a local S3 bucket using the S3 destination?
  • Aman Kesharwani
    06/20/2023, 3:27 PM
    Hi, I am trying to create a connection between an S3 source and a Redshift destination. In the S3 source setup it only lets me create a single stream for multiple Parquet files. I want to create multiple streams under the same source connection; is there any way to achieve this?
  • Srikanth Sudhindra
    06/20/2023, 6:12 PM
    Hi, trying to upgrade to Helm chart OSS version 0.45.38 on an EKS cluster, I'm running into the following error:
    2023-06-20 18:03:55 ERROR o.f.c.i.l.s.Slf4jLog(error):57 - Migration of schema "public" to version "0.50.1.001 - NotificationSettingsBackfill" failed! Changes successfully rolled back.
    2023-06-20 18:03:55 ERROR i.a.b.Application(main):25 - Unable to bootstrap Airbyte environment.
    Has anyone encountered this?
  • james truty
    06/20/2023, 8:31 PM
    Is there a way to update the connector version for all workspaces in a deployment? Say I want to update all workspaces immediately to a bugfix source-hubspot version.
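For a scripted approach, Airbyte's configuration API exposes a source-definition update endpoint whose change applies deployment-wide. A hedged sketch of building the request body (the field names are assumptions based on the public config API; verify them against the API docs for your Airbyte version):

```python
import json

def build_update_payload(source_definition_id, docker_image_tag):
    """Request body for POST /api/v1/source_definitions/update (field names assumed)."""
    return json.dumps({
        "sourceDefinitionId": source_definition_id,
        "dockerImageTag": docker_image_tag,
    })
```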
  • Madison Mae
    06/20/2023, 8:37 PM
    I added these tables to a connection that was previously working; something about adding them caused an error, and now I can't remove them and revert to the original connection. Any ideas? Not sure what the red lines on the left mean.
  • Ryan Roline
    06/20/2023, 9:33 PM
    Hello - does anyone know if the MSSQL source connector runs queries using READ UNCOMMITTED / NOLOCK by default? If not, is there a place in the config where I could specify this?
  • Akilesh V
    06/21/2023, 6:47 AM
    Hello all, does the webhook always include a
    text
    field in the body?
  • Ignacio Valdelvira
    06/21/2023, 10:34 AM
    Hi! I tried updating the source
    postgres
    from 1.0.45 to 2.0.34 and got the error
    java.lang.IllegalStateException: Get Spec job failed
    When running
    docker run --rm airbyte/source-postgres:2.0.34 check
    I get the following:
    {
      "type": "LOG",
      "log": {
        "level": "ERROR",
        "message": "ERROR i.a.i.b.AirbyteExceptionHandler(uncaughtException):26 Something went wrong in the connector. See the logs for more details. java.lang.IllegalArgumentException: org.apache.commons.cli.MissingArgumentException: Missing argument for option: config\n\tat io.airbyte.commons.cli.Clis.parse(Clis.java:34) ~[io.airbyte-airbyte-commons-cli-20.10.23.jar:?]\n\tat io.airbyte.commons.cli.Clis.parse(Clis.java:39) ~[io.airbyte-airbyte-commons-cli-20.10.23.jar:?]\n\tat io.airbyte.integrations.base.IntegrationCliParser.parseOptions(IntegrationCliParser.java:120) ~[io.airbyte.airbyte-integrations.bases-base-java-20.10.23.jar:?]\n\tat io.airbyte.integrations.base.IntegrationCliParser.parse(IntegrationCliParser.java:61) ~[io.airbyte.airbyte-integrations.bases-base-java-20.10.23.jar:?]\n\tat io.airbyte.integrations.base.IntegrationRunner.run(IntegrationRunner.java:99) ~[io.airbyte.airbyte-integrations.bases-base-java-20.10.23.jar:?]\n\tat io.airbyte.integrations.destination.bigquery.BigQueryDestination.main(BigQueryDestination.java:380) ~[io.airbyte.airbyte-integrations.connectors-destination-bigquery-20.10.23.jar:?]\nCaused by: org.apache.commons.cli.MissingArgumentException: Missing argument for option: config\n\tat org.apache.commons.cli.DefaultParser.checkRequiredArgs(DefaultParser.java:211) ~[commons-cli-1.4.jar:1.4]\n\tat org.apache.commons.cli.DefaultParser.parse(DefaultParser.java:125) ~[commons-cli-1.4.jar:1.4]\n\tat org.apache.commons.cli.DefaultParser.parse(DefaultParser.java:76) ~[commons-cli-1.4.jar:1.4]\n\tat org.apache.commons.cli.DefaultParser.parse(DefaultParser.java:60) ~[commons-cli-1.4.jar:1.4]\n\tat io.airbyte.commons.cli.Clis.parse(Clis.java:29) ~[io.airbyte-airbyte-commons-cli-20.10.23.jar:?]\n\t... 5 more\n",
        "stack_trace": "java.lang.IllegalArgumentException: org.apache.commons.cli.MissingArgumentException: Missing argument for option: config\n\tat io.airbyte.commons.cli.Clis.parse(Clis.java:34)\n\tat io.airbyte.commons.cli.Clis.parse(Clis.java:39)\n\tat io.airbyte.integrations.base.IntegrationCliParser.parseOptions(IntegrationCliParser.java:120)\n\tat io.airbyte.integrations.base.IntegrationCliParser.parse(IntegrationCliParser.java:61)\n\tat io.airbyte.integrations.base.IntegrationRunner.run(IntegrationRunner.java:99)\n\tat io.airbyte.integrations.destination.bigquery.BigQueryDestination.main(BigQueryDestination.java:380)\nCaused by: org.apache.commons.cli.MissingArgumentException: Missing argument for option: config\n\tat org.apache.commons.cli.DefaultParser.checkRequiredArgs(DefaultParser.java:211)\n\tat org.apache.commons.cli.DefaultParser.parse(DefaultParser.java:125)\n\tat org.apache.commons.cli.DefaultParser.parse(DefaultParser.java:76)\n\tat org.apache.commons.cli.DefaultParser.parse(DefaultParser.java:60)\n\tat io.airbyte.commons.cli.Clis.parse(Clis.java:29)\n\t... 5 more\n"
      }
    }
    My Airbyte version is 0.42.1. Any ideas what the issue could be, or what I should try?
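A note on the `docker run ... check` output specifically: `MissingArgumentException: Missing argument for option: config` means the connector's `check` command was invoked without a config file, so that error is expected for the bare command and is separate from the in-app "Get Spec job failed" issue. A sketch of the usual invocation, mounting a local config (the paths are illustrative):

```shell
# mount the directory holding your source config into the container,
# then point the connector's check command at it
docker run --rm -v /tmp/airbyte_config:/data \
  airbyte/source-postgres:2.0.34 check --config /data/config.json
```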
  • Haim Beyhan
    06/21/2023, 11:40 AM
    I am running the latest Airbyte Helm chart on AWS EKS, with a MySQL source and a BigQuery destination. When a change is made to the replication settings in the UI and saved, the server disconnects and we get a 502 error. I see the server pod restart at the same time. This happens quite a few times while working; the worker pod also restarted once during the process. Do you have any suggestions or best practices?
  • aidan
    06/21/2023, 11:47 AM
    Hello, I am loading data to Snowflake using the Zendesk Support connector. All streams work as expected except ticket metrics, which is the key stream for the entire connection. The metrics stream works on the initial run but then seems to hang after the first 5000 records are returned in the incremental run. This has been ongoing for the last few weeks. I have updated to the latest Zendesk version, which switched the stream to cursor pagination. Has anyone come across this issue before, and is there a known workaround or fix I can implement? Regards, Aidan
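A hang after exactly one page is the classic symptom of a pagination loop that never advances its cursor. A generic sketch of how cursor pagination should drain (fetch_page is a stand-in for the real API call, not Zendesk connector code):

```python
def fetch_all(fetch_page):
    """Drain a cursor-paginated API: each call returns (records, next_cursor)."""
    cursor, records = None, []
    while True:
        page, cursor = fetch_page(cursor)
        records.extend(page)
        if not cursor:  # no next cursor means the last page was reached
            return records
```

If the connector keeps requesting the same cursor value each time, the sync appears stuck at exactly the page size, which matches the 5000-record symptom described above.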
  • srivani karade
    06/21/2023, 11:47 AM
    Hey all, I just have a small doubt about building a custom connector on Airbyte: can we use just the connector after we build it?
  • Ohad Feiner
    06/21/2023, 12:09 PM
    Hi guys, I was wondering if I could drop a source field (or a few) when first defining my new connection, so it doesn't get created on the destination at all.
  • Ivan Sinitsyn
    06/21/2023, 12:20 PM
    Hello everyone, I have a question about the Salesforce connector. I created a connection between Salesforce and GBQ to sync only 3 tables, but each table has about 10 million records. The sync has been running for more than 36 hours already, and only 5 records have synced, from one table only. Is there any way to increase the throughput or understand where the bottleneck is? Salesforce source image 2.0.5, GBQ destination image 1.4.4, Airbyte up and running on GKE. Thanks!
  • John Wessel
    06/21/2023, 12:44 PM
    Has anyone run into this error while loading data via an incremental load from SQL Server?
    com.microsoft.sqlserver.jdbc.SQLServerException: Incorrect syntax near the keyword 'EXISTS'
  • Haim Beyhan
    06/21/2023, 12:47 PM
    I ran a helm upgrade on my current installation and now I'm getting: Message: Unauthorized! Configured service account doesn't have access. Service account may have been revoked. Unauthorized.
  • Luis Felipe Oliveira
    06/21/2023, 1:47 PM
    Hi everyone, I'm having problems with my Airbyte notifications. Before the update they worked normally, but now I don't get notified when a synchronization succeeds. I also couldn't produce a failing sync to check whether error notifications still work.
  • Jonathan Linford
    06/21/2023, 2:45 PM
    I updated the postgres source connector from
    airbyte/source-postgres:2.0.30
    to
    airbyte/source-postgres:2.0.34
    and now the connection test fails. Same with the snowflake destination connector,
    airbyte/destination-snowflake:1.0.4
    to
    airbyte/destination-snowflake:1.0.5
  • Jesus Rivero
    06/21/2023, 2:47 PM
    Hi guys, I am facing some problems with the GitHub connector: it frequently fails with timeouts. We talked with GitHub support, and they told us it is because the maximum time for processing a request is 10 seconds. I would like to know the best practices for configuring the GitHub source to avoid timeout errors.
  • Gabriel Levine
    06/21/2023, 3:21 PM
    After initial sync, Postgres CDC doesn’t sync any records to BigQuery. Airbyte version 0.50.3. Postgres source version 2.0.34. BigQuery destination version 1.4.4
  • Joel
    06/21/2023, 7:44 PM
    Hi all. We want to connect Apple Search Ads to Airbyte, and these are the steps provided in the documentation. What we haven't been able to understand is what an API user role means. Is that a user account in Apple Search Ads which has API permissions?
  • Octavia Squidington III
    06/21/2023, 7:45 PM
    🔥 Office Hours starts in 15 minutes 🔥 Topic and schedule posted in #C045VK5AF54. At 1pm PDT, click here to join us on Zoom!
  • Chidambara Ganapathy
    06/22/2023, 6:56 AM
    Hi team, I am trying to add the Facebook Marketing source connector. I have entered the Account ID, start date, end date, and Access Token, and I am getting the error below. Error: 2635, (#2635) Your app has been upgraded to version v17.0, please use this version or newer. This can be verified in the settings tab on the App Dashboard.. Please also verify your Account ID: See the https://www.facebook.com/business/help/1492627900875762 for more information. Any suggestions would be helpful. Thanks