# ask-community-for-troubleshooting

    Dustin Pearson

    03/13/2023, 3:21 PM
    https://github.com/airbytehq/airbyte/pull/23968 Could someone trigger the integration tests for this PR? My impression is that they are some sort of snapshot we will need to save the result of, but I won’t have access to do this. Also, happy to modify the PR to meet any criteria I’ve missed

    Adam Roderick

    03/13/2023, 3:42 PM
    I don't know what this workflow means in a source connector pull request. Can you help?

    Brendan Couche

    03/13/2023, 4:25 PM
    Since updating the Postgres connector from 2.0.0 -> 2.0.2, I consistently encounter exceptions during CDC sync
    Copy code
    Failure Origin: source, Message: Something went wrong in the connector. See the logs for more details.
    Rolling back to 2.0.0 allows me to recover. Relevant logs in the thread...

    Oliver Broomhall

    03/13/2023, 4:29 PM
    The email address advertised on your website, team@airbyte.io, doesn't seem to exist. I got the following message when I tried to send an email:
    We're writing to let you know that the group you tried to contact (team) may not exist, or you may not have permission to post messages to the group.

    Oliver Broomhall

    03/13/2023, 4:33 PM
    The question I was trying to ask by email was whether the following flow for Mandrill is possible with the Python CDK. I have played around a bit, but this flow seems to be outside of what's possible; please let me know if that is the case:
    1. POST https://mandrillapp.com/api/1.0/exports/activity
       a. Input: date_from, date_to
       b. Output: export id
    2. POST https://mandrillapp.com/api/1.0/exports/info
       a. Input: export id from the exports/activity response
       b. Output: state (in progress | complete), result (null | s3 url)
       c. Will need to be polled until state = complete and an s3 url is provided
    3. GET s3 url
       a. Downloads a zip file from the s3 location provided in the exports/info response
    4. Unzip the s3 download
       a. Contains csv files, each with a maximum of 1000 lines
       b. The csv files contain data exported between the date_from and date_to inputs from the exports/activity query
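For reference, the four steps above can be sketched in plain Python outside the CDK. The two endpoint paths come from the message itself; the response field names (`id`, `state`, `result_url`) and the `key` parameter are assumptions to verify against the Mandrill docs:

```python
import io
import json
import time
import urllib.request
import zipfile

API = "https://mandrillapp.com/api/1.0"

def _post(path: str, body: dict) -> dict:
    """POST a JSON body to the Mandrill API and decode the JSON response."""
    req = urllib.request.Request(f"{API}{path}",
                                 data=json.dumps(body).encode(),
                                 headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

def extract_csvs(zip_bytes: bytes) -> dict[str, str]:
    """Step 4: unzip the export archive in memory, return {filename: csv_text}."""
    archive = zipfile.ZipFile(io.BytesIO(zip_bytes))
    return {name: archive.read(name).decode() for name in archive.namelist()}

def export_activity(key: str, date_from: str, date_to: str) -> dict[str, str]:
    """Start an export, poll exports/info until complete, download and unzip."""
    # Step 1: start the export job.
    export_id = _post("/exports/activity",
                      {"key": key, "date_from": date_from, "date_to": date_to})["id"]
    # Step 2: poll until the job is done and an S3 URL appears.
    while True:
        info = _post("/exports/info", {"key": key, "id": export_id})
        if info.get("state") == "complete" and info.get("result_url"):
            break
        time.sleep(10)
    # Step 3: fetch the zip from the S3 URL.
    with urllib.request.urlopen(info["result_url"]) as resp:
        return extract_csvs(resp.read())
```

The poll-then-download shape in step 2 is the part that may need a custom Python stream rather than pure declarative YAML.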

    Andrey Groza

    03/13/2023, 5:27 PM
    Hello. I have a problem. When I start my self-hosted Airbyte on DigitalOcean, I get this error from airbyte-temporal:
    Copy code
    2023-03-13T16:35:53.779Z    ERROR   Unable to connect to SQL database.      {"error": "pq: no pg_hba.conf entry for host \"167.172.166.155\", user \"doadmin\", database \"temporal_visibility\", no encryption", "logging-call-at": "handler.go:73"}
    To solve it, I added two variables to the docker-compose file:
    Copy code
    - SQL_TLS=true
    - SQL_TLS_DISABLE_HOST_VERIFICATION=true
    The container starts, but still gives an error after a while. How can I solve this problem?
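For context, a sketch of how those flags can sit alongside the database settings in the `airbyte-temporal` service of docker-compose.yaml. The `DB`/`POSTGRES_*` names follow Temporal's auto-setup convention; the host and port values are placeholders. Note the error above mentions the `temporal_visibility` database, so the DigitalOcean pg_hba/TLS settings need to cover it as well as `temporal`:

```yaml
airbyte-temporal:
  environment:
    - DB=postgresql
    - DB_PORT=25060
    - POSTGRES_USER=doadmin
    - POSTGRES_PWD=${DATABASE_PASSWORD}
    - POSTGRES_SEEDS=your-cluster.db.ondigitalocean.com
    - SQL_TLS=true
    - SQL_TLS_DISABLE_HOST_VERIFICATION=true
```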

    Waliur Rahman

    03/13/2023, 6:11 PM
    Hello folks, I'm new to Airbyte. I deployed it on AWS K8s using Helm. The worker pod is throwing this error consistently. Any pointers to what might be wrong here and how to resolve it?
    Copy code
    io.grpc.StatusRuntimeException: RESOURCE_EXHAUSTED: namespace count limit exceeded
    begin-airbyte-worker-654fc67b4-n2hg4 airbyte-worker-container 	at io.grpc.stub.ClientCalls.toStatusRuntimeException(ClientCalls.java:271) ~[grpc-stub-1.52.1.jar:1.52.1]
    begin-airbyte-worker-654fc67b4-n2hg4 airbyte-worker-container 	at io.grpc.stub.ClientCalls.getUnchecked(ClientCalls.java:252) ~[grpc-stub-1.52.1.jar:1.52.1]
    begin-airbyte-worker-654fc67b4-n2hg4 airbyte-worker-container 	at io.grpc.stub.ClientCalls.blockingUnaryCall(ClientCalls.java:165) ~[grpc-stub-1.52.1.jar:1.52.1]

    Srikanth Sudhindra

    03/13/2023, 6:42 PM
    Are there any connectors that support reading each table from the source DB in parallel?

    Andres

    03/13/2023, 7:44 PM
    Hello everyone, I have used Airbyte in a few projects, but currently I have a challenge that I have not been able to solve. Is it possible to use some kind of filter in Facebook Marketing? I would like to replicate only the data for campaigns that have more than 1000 impressions, for example.

    yuan sun

    03/14/2023, 12:58 AM
    I added a new source, Dameng, but an error was reported when using it. What could be the cause? Can anyone help me?

    Raj Talukder

    03/14/2023, 5:36 AM
    Hi, I'm new to Airbyte and I'm attempting to deploy it into an EKS cluster through Helm. I was able to successfully deploy it with little issue. Now, since I have this deployment running, I am currently trying to configure some kind of basic authentication to allow for authenticated HTTP requests with the necessary headers to the Airbyte server. However, I'm having difficulty in finding the necessary parameters along with supporting documentation to enable this within the values.yaml. As of right now, is there support for enabling some basic auth layer within the helm deployment? If not, when can we expect that to be updated? Thank you in advance!

    Ganpat Agarwal

    03/14/2023, 5:36 AM
    Hello team, we are using the Amazon Ads source connector in a self-hosted Airbyte environment. We received a notification from Amazon regarding API deprecation https://advertising.amazon.com/API/docs/en-us/info/deprecations Wanted to check if there is any work in progress to migrate the APIs. cc: @David Fagnan

    Avi Garg

    03/14/2023, 7:19 AM
    Hi all

    Avi Garg

    03/14/2023, 8:24 AM
    Hi All

    Avi Garg

    03/14/2023, 8:24 AM
    I am running Airbyte on Windows using Docker. Now I want to use the Octavia CLI for configuration. Where do I run the Octavia CLI? Can you please help me here?

    Semih Korkmaz

    03/14/2023, 8:52 AM
    Hello, in the UI builder a decimal default value for a number field is not allowed.

    Yulong Guan

    03/14/2023, 9:59 AM
    Hi all, just a quick question: currently I am using Matillion and I am looking at Airbyte as a better option. I have several ETL pipelines running Python scripts that transform files from XML/text to CSV. How can I do that with an Airbyte source?
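Airbyte itself extracts and loads rather than running arbitrary file transforms, so one common pattern is a small pre-processing script that converts the XML to CSV before a file-based source reads it. A stdlib-only sketch; the record tag name and the flat one-level structure are assumptions about the files:

```python
import csv
import xml.etree.ElementTree as ET

def xml_to_csv(xml_text: str, record_tag: str, out_path: str) -> int:
    """Flatten every <record_tag> element's direct children into one CSV row.
    Returns the number of rows written."""
    root = ET.fromstring(xml_text)
    records = [{child.tag: (child.text or "") for child in rec}
               for rec in root.iter(record_tag)]
    if not records:
        return 0
    # Union of all field names, so rows with missing fields still align.
    fieldnames = sorted({key for rec in records for key in rec})
    with open(out_path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=fieldnames)
        writer.writeheader()
        writer.writerows(records)
    return len(records)
```

The resulting CSVs can then be picked up by a file source pointed at the output directory.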

    Avinash

    03/14/2023, 11:29 AM
    Hi, this is Avinash. We are trying to install Airbyte on our office server, but that machine has no internet access for security reasons. Installation is failing with a Java error: failed to fetch remote definitions.

    Avinash

    03/14/2023, 11:30 AM
    How can we get it installed?

    Leandro Bleda Cantos

    03/14/2023, 11:46 AM
    Hello everyone, what is the best way to refresh the source schema programmatically? Our setup: • EC2 + Docker Compose • Airbyte version 0.40.28
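One option is the server's REST API. A sketch against the `/v1/sources/discover_schema` endpoint of the Airbyte configuration API; verify the base URL, auth, and the `disable_cache` flag against your 0.40.28 instance before relying on it:

```python
import json
import urllib.request

AIRBYTE_URL = "http://localhost:8000/api"  # adjust to your EC2 host

def discover_payload(source_id: str) -> dict:
    """Request body for POST /v1/sources/discover_schema.
    disable_cache asks the server to run a fresh discovery on the source."""
    return {"sourceId": source_id, "disable_cache": True}

def refresh_source_schema(source_id: str) -> dict:
    """Trigger schema re-discovery and return the discovered catalog."""
    req = urllib.request.Request(
        f"{AIRBYTE_URL}/v1/sources/discover_schema",
        data=json.dumps(discover_payload(source_id)).encode(),
        # If your deployment sits behind the basic-auth proxy, add an
        # Authorization header with your credentials here.
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```

The response contains the newly discovered catalog, which can then be applied to the connection via the connections API.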

    laila ribke

    03/14/2023, 12:01 PM
    Hi, I'm trying to connect the Facebook Pages source. I definitely have a long-lived access token, but I keep getting the failure "non json response". Any ideas? Which version should I use?

    Vikrant Aggarwal

    03/14/2023, 12:39 PM
    While trying to deploy Airbyte on k8s, the health probe is failing. I see that api/v1/health is not present inside the image.
    Copy code
    image: airbyte/webapp:0.41.0
    
    10.30.248.216 - - [14/Mar/2023:12:36:04 +0000] "GET /api/v1/health HTTP/1.1" 404 153 "-" "kube-probe/1.24" "-"
    10.30.248.216 - - [14/Mar/2023:12:36:14 +0000] "GET /api/v1/health HTTP/1.1" 404 153 "-" "kube-probe/1.24" "-"
    2023/03/14 12:36:14 [error] 46#46: *41 open() "/usr/share/nginx/html/api/v1/health" failed (2: No such file or directory), client: xx.xx.xx.216, server: localhost, request: "GET /api/v1/health HTTP/1.1", host: "172.25.243.13:80"
    
    / # ls /usr/share/nginx/html/api
    ls: /usr/share/nginx/html/api: No such file or directory

    Morgan Pyper

    03/14/2023, 1:06 PM
    Hi, I have Airbyte 0.42.0 deployed on AWS EC2 with Docker, having just attempted to upgrade from 0.40.31. With a fresh .env file the instance starts up just fine, but once I change the database variables (DATABASE_USER etc.) to the postgres database we use to store connections etc. the Connections, Sources and Destinations pages in the web UI no longer load (“Oops! Something went wrong… / Unknown error occurred”). I can see all our sources and destinations in the Settings, so it’s clearly connecting to the database in some fashion, but right now it means our entire ELT process is dead. Could anyone help?

    Bruno Agresta González

    03/14/2023, 1:36 PM
    Hi all, I have Airbyte v0.41.0 with the Iterable connector v0.1.26. The last successful sync was on 02/08; since that day an error appears in the sync process. I tried to recreate the connection and upgrade the connector, but I have no idea how to solve it. Could anyone help? Thank you!

    Victor Bruno Castro

    03/14/2023, 1:52 PM
    Hey guys, I'm using Airbyte as our main connector for sources such as Postgres, Shopify, and Recurly. I want to know if there is a clean way to drop the tmp tables created by Airbyte. Those tables make it really difficult to navigate through the tables that we already have. Thank you!
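Leftover staging tables usually carry an `_airbyte_tmp_` prefix, but verify the naming in your destination before deleting anything. A hypothetical sketch that lists candidates through any DB-API connection (e.g. psycopg2 for Postgres) and builds the DROP statements:

```python
AIRBYTE_TMP_PREFIX = "_airbyte_tmp_"  # assumed naming; confirm in your schema

def find_tmp_tables(conn, prefix: str = AIRBYTE_TMP_PREFIX) -> list[str]:
    """List tables whose names carry the staging prefix, via a DB-API connection."""
    with conn.cursor() as cur:
        # Escape underscores so LIKE matches them literally.
        cur.execute(
            "SELECT table_name FROM information_schema.tables "
            "WHERE table_name LIKE %s",
            (prefix.replace("_", r"\_") + "%",),
        )
        return [row[0] for row in cur.fetchall()]

def drop_statements(table_names: list[str],
                    prefix: str = AIRBYTE_TMP_PREFIX) -> list[str]:
    """Build DROP statements, guarding against touching non-staging tables."""
    return [f'DROP TABLE IF EXISTS "{name}";'
            for name in table_names if name.startswith(prefix)]
```

Running the generated statements in a transaction lets you inspect the list before committing.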

    Xavier

    03/14/2023, 2:21 PM
    Hi guys! We are developing a custom connector using the declarative (low-code) YAML approach, but we are running into an issue. When I run the read operation locally using Python, I can see the connector reading records and emitting state as expected. However, after building the container and running it through Airbyte I get the following error:
    Copy code
    Caused by: java.lang.RuntimeException: No properties node in stream schema
    	at io.airbyte.workers.general.DefaultReplicationWorker.populateStreamToAllFields(DefaultReplicationWorker.java:696)
    The error is descriptive enough to find my issue: I do not have a schema defined for the stream, and thus no properties node can be found inside that schema (strange that it does run locally, then). However, nothing changes when I try to fix the error by adding a .json file in the schemas directory with the file name equal to the stream name. I have tried both adding the real schema of the data and a schema which should not correspond to the data; neither has an effect. It still runs fine locally (without showing any warnings or errors about the wrong schema) but not on the Airbyte platform (making sure to refresh the schema and that the latest version of the image is present).
    From reading the documentation I assumed that by defining this file with the right name in the right directory, Airbyte should find it and use it. I have seen that you can also define the schema (or the location of the schema file) manually in the configured-catalog.json file or in the YAML file itself. Should I define it in either of these (and what is the priority between these options)? Thank you!
    Update: We have gotten it to kinda work by defining the location of the JSON file manually in the YAML as follows:
    Copy code
    schema_loader:
      type: JsonFileSchemaLoader
      file_path: location of .json file in the schemas directory
    However, the schema defined in that file seems to get ignored: when we try adding a required property that does not exist, everything still works.
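For anyone hitting the same `No properties node in stream schema` error: the platform expects each stream's JSON schema to be an object with a top-level `properties` key. A minimal schema of that shape, with placeholder field names:

```python
import json

# Minimal JSON Schema for one stream. The platform-side check that raised
# the error above looks for this top-level "properties" object.
minimal_stream_schema = {
    "$schema": "http://json-schema.org/draft-07/schema#",
    "type": "object",
    "properties": {
        "id": {"type": "string"},
        "created_at": {"type": ["null", "string"], "format": "date-time"},
    },
}

print(json.dumps(minimal_stream_schema, indent=2))
```

Saved as `schemas/<stream_name>.json`, this is the file a `JsonFileSchemaLoader` is expected to pick up.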

    user

    03/14/2023, 3:00 PM
    Hello Praveenraaj K S, it's been a while without an update from us. Are you still having problems or did you find a solution?

    user

    03/14/2023, 3:04 PM
    Hello Lindsay S, it's been a while without an update from us. Are you still having problems or did you find a solution?

    Alexsander Lucas Dutra Lima Pereira

    03/14/2023, 4:17 PM
    Hi everyone, I'm new here and I need some help with Airbyte's Helm chart. When I do an upgrade using helm upgrade, some strange things happen: 1. I can't view the logs of previous runs; NoSuchBucket errors occur in the logs. 2. New syncs stop working, with many errors. Do you know what I could be doing wrong?

    Anchit

    03/14/2023, 6:28 PM
    Hi team, I'm working with the S3 destination connector. Our S3 data lake and its access are configured with IAM roles: a dev can assume these roles and get a temporary AWS Access Key, Secret, and Session Token. However, the current S3 connector only takes Access Key ID and Secret Access Key, not Session Token. How do we implement this connector with our IAM roles config? Would also like to know if you'd recommend IAM roles or IAM users. Is there any guidance on this? Thanks!
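To illustrate the gap: STS `AssumeRole` returns three credential fields, and the third has nowhere to go in the current S3 connector form. The connector config key names below are illustrative, not the connector's actual spec:

```python
def sts_to_connector_config(credentials: dict) -> dict:
    """Map the 'Credentials' block of an STS AssumeRole response onto the
    three values a key-based S3 setup would need. The SessionToken entry has
    no slot in the current S3 destination form, which is the gap described above."""
    return {
        "access_key_id": credentials["AccessKeyId"],
        "secret_access_key": credentials["SecretAccessKey"],
        "session_token": credentials["SessionToken"],  # unsupported by the form today
    }

# With boto3 (third-party), the credentials dict would come from e.g.:
#   sts = boto3.client("sts")
#   creds = sts.assume_role(RoleArn="arn:aws:iam::...:role/airbyte",
#                           RoleSessionName="airbyte")["Credentials"]
```

Temporary role credentials are only valid together with their session token, which is why plain key/secret fields can't carry assumed-role access.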