# ask-community-for-troubleshooting

    Samad Husain

    10/20/2022, 3:20 PM
    Does the BigQuery destination using the GCS Staging loading method require a public GCS bucket?

    Murat Cetink

    10/20/2022, 3:25 PM
    I’m getting a "failed to fetch schema" error after upgrading to v0.40.15. My source connector is Zuora 0.1.3.

    Hemalatha Tahapa

    10/20/2022, 3:30 PM
    Hi everyone, can I know when an alpha source connector like Microsoft Dynamics GP will become publicly available? We have a strong requirement to migrate data from Microsoft Dynamics GP to Snowflake. Or is there any other method? Thanks, waiting for the reply.

    Francisco Viera

    10/20/2022, 3:48 PM
    Additional Failure Information: java.lang.RuntimeException: java.lang.RuntimeException: com.google.cloud.bigquery.BigQueryException: Cannot query over table 'trf_jm_cdp_dev.trf_tbl_cust_id_dict_cdp_dev' without a filter over column(s) '_PARTITION_LOAD_TIME', '_PARTITIONDATE', '_PARTITIONTIME' that can be used for partition elimination

    Francisco Viera

    10/20/2022, 3:48 PM
    Does the BigQuery source connector not work with partitioned tables that require a _PARTITIONTIME filter?
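For context on the error above: BigQuery tables created with require_partition_filter reject any statement that lacks a predicate on the partition column, and the source appears to issue unfiltered SELECTs. One possible workaround (an assumption, not a documented fix) is to sync a view that bakes the filter in; a minimal sketch of the required predicate, using the table name from the error and an arbitrary 3-day window:

```python
# Sketch: a query shape BigQuery accepts for tables that enforce
# require_partition_filter. The table name comes from the error above;
# the window length is an arbitrary example value.
TABLE = "trf_jm_cdp_dev.trf_tbl_cust_id_dict_cdp_dev"

def partitioned_query(table: str, days: int) -> str:
    """Build a SELECT with the partition filter BigQuery requires."""
    return (
        f"SELECT * FROM `{table}` "
        f"WHERE _PARTITIONTIME >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), "
        f"INTERVAL {days} DAY)"
    )

print(partitioned_query(TABLE, 3))
```

Creating a view from this query and pointing the connector at the view avoids the unfiltered scan.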

    Brandon

    10/20/2022, 7:21 PM
    Hi everyone, very new to Airbyte, but I really like it so far. I am looking to pull some data from an AWS MySQL database into Snowflake. I have the connection configured and it's moving data, but at the end of the process it still shows as failed. I am not sure of the exact error message, but some I have seen while scrolling through are:
    io.airbyte.config.FailureReason@ece7f36[failureOrigin=source,failureType=<null>,internalMessage=io.airbyte.workers.general.DefaultReplicationWorker$SourceException: Source process exited with non-zero exit code 1,externalMessage=Something went wrong within the source connector,metadata=io.airbyte.config.Metadata@64f5e48d[additionalProperties={attemptNumber=0, jobId=28, connector_command=read}],stacktrace=java.util.concurrent.CompletionException: io.airbyte.workers.general.DefaultReplicationWorker$SourceException: Source process exited with non-zero exit code 1
    java.sql.SQLException: The last transaction was aborted due to Seamless Scaling. Please retry.
    i.a.w.g.DefaultReplicationWorker(run):184 - Sync worker failed.
    Anyone familiar with this error, and have an idea of a resolution?
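On the "Seamless Scaling" line above: Aurora Serverless aborts in-flight transactions during scaling events, and AWS's guidance is simply to retry. Airbyte retries sync attempts itself, but as an illustration of the pattern, a transient-error retry wrapper looks like this (the exception class and function names are hypothetical stand-ins):

```python
import time

class TransientDBError(Exception):
    """Stand-in for 'transaction aborted due to Seamless Scaling'."""

def with_retries(fn, attempts=3, delay=0.01):
    """Call fn, retrying on transient errors; re-raise after the last try."""
    for i in range(attempts):
        try:
            return fn()
        except TransientDBError:
            if i == attempts - 1:
                raise
            time.sleep(delay)

# Usage: a read that fails once during a scaling event, then succeeds.
calls = {"n": 0}
def flaky_read():
    calls["n"] += 1
    if calls["n"] == 1:
        raise TransientDBError("aborted due to Seamless Scaling")
    return "rows"
```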

    Nicolas

    10/20/2022, 7:54 PM
    Hello guys, I've been following Airbyte for a long time now, and it has sure come a long way since my last attempt to use the tool. I have an Azure deployment where I am running Airbyte as suggested in the documentation, on a 16 vCPU / 64 GB machine. I am trying to sync 2 MSSQL databases. The sync itself works like a charm; I have it set to dedup + history, which is my current need since the tables can be updated but are too big for a full refresh. But (there is always a but) the amount of time it takes bothers me: the log summary says 1,300 rows were synced, but it took over 1 hour. Is this expected behaviour? Am I doing something wrong, or is there something I should be paying attention to that I am not? Neither of the databases seems to be under any sort of stress, but the normalization phase is taking ages to complete. Any help is appreciated, and thanks again for the awesome tool you guys are working on.

    Robert Put

    10/20/2022, 8:37 PM
    Using the CLI, is there a way to apply sync changes without resetting all the streams? The goal is to add new streams for a Postgres DB without resetting the current ones. I need to exclude certain columns (PII), so the only option looks to be the CLI/API.
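Recent Airbyte versions expose a skipReset flag on the web_backend/connections/update endpoint; whether your version has it is worth verifying against its API reference before relying on this. A sketch of the payload (connection ID and catalog are placeholders):

```python
import json

def build_update_payload(connection_id: str, sync_catalog: dict) -> dict:
    """Payload for POST /api/v1/web_backend/connections/update.

    `skipReset` is an assumption: it exists in recent Airbyte versions,
    but confirm against your deployment's API schema.
    """
    return {
        "connectionId": connection_id,
        "syncCatalog": sync_catalog,
        "skipReset": True,  # keep existing stream state; no full reset
    }

# Usage, e.g. piped into curl against a local deployment:
#   curl -X POST http://localhost:8000/api/v1/web_backend/connections/update \
#        -H "Content-Type: application/json" -d @payload.json
payload = build_update_payload(
    "00000000-0000-0000-0000-000000000000", {"streams": []}
)
print(json.dumps(payload))
```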

    Lee Danilek

    10/20/2022, 8:48 PM
    hi! i'm getting ready to send a PR for a new connector, and the checklist says
    - [ ] Connector's `bootstrap.md`. See [description and examples](<https://docs.google.com/document/d/1ypdgmwmEHWv-TrO4_YOQ7pAJGVrMp5BOkEVh831N260/edit?usp=sharing>)
    that linked google doc has been deleted, and I don't see anything about bootstrap.md in the docs https://docs.airbyte.com/connector-development/ . I can probably pattern match on examples in the codebase, but it would be nice to know what this is for.

    Francisco Viera

    10/20/2022, 10:19 PM
    Hello guys, I need help. I have a connection from MSSQL to BigQuery as the destination. The process succeeds, but it always recreates the tables in BigQuery. We need to use policy tags, and when I apply a policy tag, Airbyte recreates the tables and deletes the policy tag I configured.

    Zaza Javakhishvili

    10/21/2022, 12:19 AM
    Hi contributors. Please help review and merge my changes. https://github.com/airbytehq/airbyte/pull/18283

    Raghu Bharadwaj

    10/21/2022, 5:24 AM
    Hi everyone, just getting started with Airbyte. I wanted to set up an incremental sync from Postgres to Kafka, and was successful in setting up the source, destination and connection according to the documentation provided; thanks for that. The Postgres tables for which I wanted to set up logical-replication CDC are 15 tables from two different schemas inside the same database. I tried syncing them, but there seem to be some errors in the process and I couldn't figure out what they were due to. Can somebody help me with this, please?
    Failed
    Last attempt:
    8.56 MB | 29,303 emitted records | no records | 7m 4s | Sync
    Failure Origin: source, Message: Something went wrong in the connector. See the logs for more details.
    2022-10-20 19:51:37 - Additional Failure Information: java.lang.RuntimeException: java.lang.RuntimeException: org.apache.kafka.connect.errors.ConnectException: An exception occurred in the change event producer. This connector will be stopped.

    Joe.J.Xu

    10/21/2022, 7:44 AM
    Hi guys, I encountered some errors when I tried to set up a Google Ads source. I think this may be a proxy issue, as I deployed Airbyte behind a proxy; similar errors appeared during Bing Ads source setup. I configured http_proxy & https_proxy for Docker, but it didn't work. Any suggestions on this problem? Thanks.

    Fabrice Simon

    10/21/2022, 8:20 AM
    Hello, we have been using Airbyte for a few months and are wondering what good practices you follow for administering connections and their configuration, because we regularly have trouble tracking the changes a team member makes in the UI. We wonder if we should define connections only through the API and track changes via our repo on GitLab. Do you have any feedback on this usage? What do you think about it? Thanks.

    Sebastian Brickel

    10/21/2022, 8:21 AM
    Hey folks, I am developing an Airbyte connector and I was informed that, since I have
    paginator:
          type: NoPagination
    in my connector definition, I will only retrieve data from the first page of the service. What are the options for retrieving all pages of the service? I checked https://docs.airbyte.com/connector-development/config-based/understanding-the-yaml-file/pagination/ but none of the options there are supported by the API
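If the API doesn't match any of the built-in paginator types but does accept something like an offset/limit pair, the pattern a paginator implements is the following client-side loop (the fetch function here is a hypothetical stand-in for the real API call):

```python
def fetch_all(fetch_page, page_size=2):
    """Drain a paginated endpoint by advancing an offset until a short page."""
    records, offset = [], 0
    while True:
        page = fetch_page(offset=offset, limit=page_size)
        records.extend(page)
        if len(page) < page_size:  # last page reached
            return records
        offset += page_size

# Usage against a fake 5-record API:
DATA = list(range(5))
def fake_fetch(offset, limit):
    return DATA[offset:offset + limit]
```

The stop condition (a page shorter than the requested limit) is what the low-code CDK's pagination strategies encode declaratively; which request parameter carries the offset depends entirely on the service.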

    Rahul Borse

    10/21/2022, 8:43 AM
    Hello, why does the Airbyte output have an _airbyte_emitted_at field in the CSV file? What is it used for? I believe that when I want an incremental sync I can use a cursor field on the created_at column in my DB. I am just trying to understand why we need the _airbyte_emitted_at field. Can someone please help me understand?
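One way to see the distinction: _airbyte_emitted_at records when Airbyte read the record, independent of any business timestamp like created_at, so it is load metadata (useful for auditing and for telling apart copies of the same row from different syncs), not the incremental cursor. A small sketch of using it to keep only the most recently loaded copy of each row:

```python
from datetime import datetime

# Two loads of the same record: same key, different load times.
rows = [
    {"id": 1, "name": "old", "_airbyte_emitted_at": datetime(2022, 10, 20)},
    {"id": 1, "name": "new", "_airbyte_emitted_at": datetime(2022, 10, 21)},
]

def latest_per_id(rows):
    """Keep the most recently emitted copy of each id."""
    best = {}
    for r in rows:
        cur = best.get(r["id"])
        if cur is None or r["_airbyte_emitted_at"] > cur["_airbyte_emitted_at"]:
            best[r["id"]] = r
    return best
```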

    Adrian Castro

    10/21/2022, 8:52 AM
    Hellooo! Is anyone interested in developing, or has anyone tried to develop, an Octavia CLI / Airbyte API Ansible module? I think that with the current state it should be possible.

    Rahul Borse

    10/21/2022, 9:34 AM
    Hello, when I select 2 tables in the schema to replicate to the destination and launch that connection, I get the error below: io.grpc.StatusRuntimeException: RESOURCE_EXHAUSTED: grpc: received message larger than max. Can someone help me understand when this error occurs and how to prevent it?

    Angel

    10/21/2022, 11:23 AM
    Hi! If I configure an external Postgres on RDS, can I use an AWS Secret to connect to it? Thanks 🙂
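Airbyte's Postgres config takes plain host/user/password fields, but nothing stops you from fetching those values from Secrets Manager before filling them in. A sketch with boto3 (the secret name is a placeholder, and the JSON field names assume the standard shape of an RDS-managed secret; verify them against your actual secret):

```python
import json

def parse_rds_secret(secret_string: str) -> dict:
    """RDS-managed secrets store a JSON document; pull out connection fields."""
    s = json.loads(secret_string)
    return {
        "host": s["host"],
        "port": s["port"],
        "username": s["username"],
        "password": s["password"],
    }

# Fetching the string itself uses boto3 (needs AWS credentials):
#   import boto3
#   sm = boto3.client("secretsmanager")
#   raw = sm.get_secret_value(SecretId="my-rds-secret")["SecretString"]
sample = ('{"username": "airbyte", "password": "s3cret", '
          '"host": "db.example.com", "port": 5432}')
print(parse_rds_secret(sample)["host"])
```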

    Christopher Teljstedt

    10/21/2022, 11:59 AM
    Hi guys! I am trying to integrate with Azure Synapse, however I cannot get it to work. See image attached. It is essentially the same problem described here: https://discuss.airbyte.io/t/issues-connecting-to-azure-synapse/2720 and I tried to provide the same insight there as here. I pinned down the error to PRIMARY KEY not being compatible with Azure Synapse without using NONCLUSTERED NOT ENFORCED. I just started with Airbyte today and have no experience contributing to a project; I also created a commit that is probably completely wrong and breaking for other MS SQL integrations. What would be the best way to go about getting support for Azure Synapse? Is anyone willing to assist me in getting the MSSQL destination working for Azure Synapse?

    Harel Oshri

    10/21/2022, 12:07 PM
    Hello everyone! I’m using Airbyte from an S3 source with CSV files to Snowflake. Everything works great, but recently we had to change the schema and add 2 more columns. When I click "Refresh source schema" in the UI it takes a very long time and nothing happens; the schema never refreshes. Any suggestions?

    Alexander Pospiech

    10/21/2022, 12:48 PM
    Hi everyone, we've got some problems setting up the latest release of Airbyte (0.40.16) on an EC2 instance running Ubuntu 22.04. The UI is not reachable, and after a while I get a "504 Gateway Time-out". Below are some investigations we did. Where should we look next? We tried docker-compose down and docker-compose up -d, after which the following errors seemed to disappear, but still nothing works. The Docker images come up nicely.
    $ docker-compose ps
           Name                     Command               State                                        Ports                                    
    --------------------------------------------------------------------------------------------------------------------------------------------
    airbyte-bootloader   /bin/bash -c ${APPLICATION ...   Up                                                                                    
    airbyte-cron         /bin/bash -c ${APPLICATION ...   Up                                                                                    
    airbyte-db           docker-entrypoint.sh postgres    Up       5432/tcp                                                                     
    airbyte-proxy        ./run.sh ./run.sh                Up       80/tcp, 0.0.0.0:8000->8000/tcp,:::8000->8000/tcp,                            
                                                                   0.0.0.0:8001->8001/tcp,:::8001->8001/tcp                                     
    airbyte-server       /bin/bash -c ${APPLICATION ...   Up       8000/tcp, 0.0.0.0:49316->8001/tcp,:::49316->8001/tcp                         
    airbyte-temporal     ./update-and-start-temporal.sh   Up       6933/tcp, 6934/tcp, 6935/tcp, 6939/tcp, 7233/tcp, 7234/tcp, 7235/tcp,        
                                                                   7239/tcp                                                                     
    airbyte-webapp       /docker-entrypoint.sh ngin ...   Up       0.0.0.0:49317->80/tcp,:::49317->80/tcp                                       
    airbyte-worker       /bin/bash -c ${APPLICATION ...   Up       0.0.0.0:49668->9000/tcp,:::49668->9000/tcp                                   
    init                 /bin/sh -c ./scripts/creat ...   Exit 0
    There are PSQLExceptions in airbyte-worker and airbyte-cron.
    $ docker logs airbyte-worker
    2022-10-21 11:09:56,409 main INFO Loading mask data from '/seed/specs_secrets_mask.yaml
    
        ___    _      __          __
       /   |  (_)____/ /_  __  __/ /____
      / /| | / / ___/ __ \/ / / / __/ _ \
     / ___ |/ / /  / /_/ / /_/ / /_/  __/
    /_/  |_/_/_/  /_.___/\__, /\__/\___/
                        /____/
            : airbyte-workers :
    
      Micronaut (v3.7.2)
    
    2022-10-21 11:09:58 ESC[32mINFOESC[m i.m.c.e.DefaultEnvironment(<init>):159 - Established active environments: [ec2, cloud, control-plane]
    2022-10-21 11:09:58 ESC[32mINFOESC[m c.z.h.HikariDataSource(<init>):71 - HikariPool-1 - Starting...
    2022-10-21 11:10:09 ESC[1;31mERRORESC[m c.z.h.p.HikariPool(throwPoolInitializationException):526 - HikariPool-1 - Exception during pool initialization.
    org.postgresql.util.PSQLException: The connection attempt failed.
            at org.postgresql.core.v3.ConnectionFactoryImpl.openConnectionImpl(ConnectionFactoryImpl.java:331) ~[postgresql-42.5.0.jar:42.5.0]
           ...
    Caused by: java.net.SocketTimeoutException: Connect timed out
            ....
    airbyte-server just shows this message at the end of the log:
    $ docker logs airbyte-server
    2022-10-21 11:09:57 ESC[33mWARNESC[m i.a.d.c.DatabaseAvailabilityCheck(check):38 - Waiting for database to become available...
    2022-10-21 11:09:57 ESC[32mINFOESC[m i.a.d.c.DatabaseAvailabilityCheck(lambda$isDatabaseConnected$1):75 - Testing airbyte configs database connection...
    airbyte-db does not show any errors airbyte-proxy shows some timeouts:
    1 upstream timed out (110: Connection timed out) while connecting to upstream
    airbyte-temporal
    $ docker logs airbyte-temporal
    Start init
    Done init
    Waiting for PostgreSQL to startup.
    The database in airbyte-db looks empty 🤔
    $ psql -h localhost -p 5432 -d airbyte -U docker
    Password for user docker: 
    psql (14.5 (Ubuntu 14.5-0ubuntu0.22.04.1), server 13.8)
    Type "help" for help.
    
    airbyte=# \d
    Did not find any relations.
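The PSQLException with "Connect timed out" in airbyte-worker points at the worker not reaching airbyte-db at all (which would also explain the empty database and the waiting server). Before digging further it's worth confirming plain TCP reachability from inside the container network, e.g. with a check like:

```python
import socket

def tcp_reachable(host: str, port: int, timeout: float = 2.0) -> bool:
    """True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:  # refused, timed out, or unresolvable
        return False

# e.g. run from inside the airbyte-worker container:
#   docker exec airbyte-worker python3 -c '...; print(tcp_reachable("airbyte-db", 5432))'
```

If that returns False, the problem is Docker networking or a security group rather than Airbyte itself.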

    Hiep Minh Pham

    10/21/2022, 1:08 PM
    Hi, I am having an issue with setting up a custom report for Google Search Console. I followed the instructions in this documentation page from Airbyte. The parameters I used for the custom report are:
    {'name': 'page_query', 'dimensions': ['page', 'query']}
    but I get an error when I try to set up a connection. Please note that it works fine without the custom report. Do you have any ideas? Appreciate your help!

    Francisco Viera

    10/21/2022, 2:07 PM
    I need to use policy tags on a BigQuery schema, but Airbyte always recreates the tables.

    Blake Hughes

    10/21/2022, 2:41 PM
    Hi. I’m trying to use the dagster-airbyte integration and am getting this error: Request to Airbyte API failed: 401 Client Error: Unauthorized for url: http://localhost:8000/api/v1/connections/get This is the code that’s failing. Any help to resolve the error would be greatly appreciated:
    my_airbyte_resource = airbyte_resource.configured(
        {
            "host": "localhost",
            "port": "8000",
        }
    )
    sync_foobar = airbyte_sync_op.configured(
        {"connection_id": "8bb7ae88-350d-40f4-947c-dfbe682311af"}, name="sync_foobar"
    )
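A plausible explanation for the 401: since v0.40 the airbyte-proxy fronts the API with HTTP basic auth (defaults airbyte / password, overridable via BASIC_AUTH_USERNAME and BASIC_AUTH_PASSWORD in .env), and the dagster resource above sends no credentials. Recent dagster-airbyte versions accept username/password config on airbyte_resource; worth checking for your version. In any case, the header the proxy expects is ordinary basic auth:

```python
import base64

def basic_auth_header(user: str, password: str) -> str:
    """Authorization header value the airbyte-proxy expects."""
    token = base64.b64encode(f"{user}:{password}".encode()).decode()
    return f"Basic {token}"

# Defaults from Airbyte's .env (change them if you changed the proxy config):
header = basic_auth_header("airbyte", "password")
print(header)
```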

    Kevin Phan

    10/21/2022, 3:16 PM
    Hi folks! QQ: when setting up S3 as a destination, are there any docs on how to use IRSA instead of the basic auth AWS key and ID? We need to use IAM roles, or in this case, since Airbyte is deployed to EKS, IRSA. We are also having the same problem setting up the Airbyte logging bucket for S3. I am running on 0.40.0-alpha.

    Guy Feldman

    10/21/2022, 4:35 PM
    trying to upgrade to airbyte 0.40.16 and the syncs work just fine, but when I run a reset I get
    Cannot invoke "io.airbyte.config.JobSyncConfig.getSourceDockerImage()" because "jobSyncConfig" is null
    has anyone run into this?

    Yoan Yahemdi

    10/21/2022, 7:23 PM
    Hello guys, after this "connect to Airbyte" step, when I reach localhost I am asked for a username and password which I never set up. Can someone help me?

    Kevin Phan

    10/21/2022, 7:45 PM
    I am getting this error when creating new sources:
    Invalid json input. Unrecognized field "workspaceId" (class io.airbyte.api.model.generated.SourceCoreConfig), not marked as ignorable (2 known properties: "connectionConfiguration", "sourceDefinitionId"]) at [Source: (org.glassfish.jersey.message.internal.ReaderInterceptorExecutor$UnCloseableInputStream); line: 1, column: 257] (through reference chain: io.airbyte.api.model.generated.SourceCoreConfig["workspaceId"]) Unrecognized field "workspaceId" (class io.airbyte.api.model.generated.SourceCoreConfig), not marked as ignorable
    I also updated my Airbyte version to 0.40.15. What is the latest stable version I can use? I saw threads mentioning this is a backend issue.