# ask-community-for-troubleshooting
  • j

    JPG

    10/04/2022, 8:26 PM
    Is there a difference between using the POST /v1/web_backend/connections/update and the regular POST /v1/connections/update? I mean besides the skipReset functionality.
    ✍️ 1
    s
    • 2
    • 2
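On the API question above: a minimal sketch of how the two endpoints are typically called, assuming the internal config API is reachable at /api/v1. As far as the public OpenAPI spec shows, the plain endpoint takes the narrower ConnectionUpdate body, while the web_backend variant takes the richer WebBackendConnectionUpdate body that the UI uses (syncCatalog, operations, skipReset, ...); required fields vary by version, so treat the payloads below as illustrative.

```python
import requests

BASE = "http://localhost:8000/api/v1"   # adjust host/port/auth for your deployment
CONNECTION_ID = "<connection-uuid>"      # hypothetical placeholder

# Plain config API update: core connection fields only.
requests.post(
    f"{BASE}/connections/update",
    json={"connectionId": CONNECTION_ID, "status": "inactive"},
).raise_for_status()

# Web-backend update: same effect here, but accepts UI-oriented extras such as skipReset.
requests.post(
    f"{BASE}/web_backend/connections/update",
    json={"connectionId": CONNECTION_ID, "status": "inactive", "skipReset": True},
).raise_for_status()
```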
  • d

    Dipti Bijpuria

    10/04/2022, 8:56 PM
    Hi all. I have built a custom source connector with a set of parameters for my requirements. One of these parameters needs to be updated before every run. I can see we can use /v1/sources/update to update this parameter, but this request requires the entire "*connectionConfiguration*" associated with the source to be sent over, which also includes the password, one of the parameters. Is there a way I could avoid sending the entire set and only send the parameter that needs to be updated?
    ✍️ 1
    ✅ 1
    m
    • 2
    • 2
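There is no partial-update endpoint that I'm aware of, but a common workaround is to read the current config with /v1/sources/get, change just the one parameter, and send the whole object back, so the password never has to live in the updater script. A rough sketch under that assumption (note that some versions mask secrets in the sources/get response, so check what comes back before round-tripping it):

```python
import requests

BASE = "http://localhost:8000/api/v1"
SOURCE_ID = "<source-uuid>"              # hypothetical placeholder

src = requests.post(f"{BASE}/sources/get", json={"sourceId": SOURCE_ID}).json()

config = src["connectionConfiguration"]  # full config, password included (possibly masked)
config["run_date"] = "2022-10-05"        # hypothetical parameter rotated before every run

requests.post(
    f"{BASE}/sources/update",
    json={"sourceId": SOURCE_ID, "name": src["name"], "connectionConfiguration": config},
).raise_for_status()
```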
  • c

    claudio viera

    10/04/2022, 9:07 PM
    hello, how can Airbyte work with GCP Workload Identity?
    ✍️ 1
    s
    • 2
    • 2
  • r

    Rocky Appiah

    10/04/2022, 9:35 PM
    Does airbyte offer a paid support model at all? We’ve been unable to get resolution when using the self-hosted docker version (post here)
    u
    • 2
    • 2
  • f

    Fabrício Lima

    10/04/2022, 9:40 PM
    Hello, I need some help creating a Redshift destination in Airbyte. Using the 'Standard' uploading method it worked, but when I change to the 'S3 Staging' uploading method it creates the bucket and then shows me these lines in the logs:
    Copy code
    Started testing if IAM user can call listObjects on the destination bucket
    Finished checking for listObjects permission
    Something went wrong in the connector. See the logs for more details.
    Closed all resources for pod
    Error checking connection, status: Optional.empty, exit code: 1
    ✍️ 1
    s
    • 2
    • 7
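The log above stops right after the listObjects check, which suggests the failure is in whatever the connector tries next (typically writing a test object to the staging path). A quick way to narrow it down is to exercise the same bucket with the same credentials outside Airbyte; a minimal boto3 sketch with placeholder credentials and bucket names, assuming the IAM user also needs put/delete permissions on the staging prefix:

```python
import boto3

s3 = boto3.client(
    "s3",
    aws_access_key_id="<ACCESS_KEY>",        # same key pair configured in the destination
    aws_secret_access_key="<SECRET_KEY>",
    region_name="<BUCKET_REGION>",
)
bucket = "<STAGING_BUCKET>"

# Mirror the connector's first check: can we list the bucket?
s3.list_objects_v2(Bucket=bucket, MaxKeys=1)

# Then confirm we can write and delete a test object under the staging prefix.
key = "airbyte-connection-check/test.txt"
s3.put_object(Bucket=bucket, Key=key, Body=b"ok")
s3.delete_object(Bucket=bucket, Key=key)
print("list/put/delete all succeeded with these credentials")
```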
  • d

    Dmitry Spodarets

    10/04/2022, 10:34 PM
    Hi! I have one problem. Synchronization from BigQuery to MongoDB takes about 5h :( But the size of the BigQuery data is only 344.65 MB. Is there any way to speed up this process?
    ✍️ 1
    m
    • 2
    • 2
  • r

    Ramesh Shanmugam

    10/04/2022, 11:06 PM
    S3 destination -- it seems the change in https://github.com/airbytehq/airbyte/pull/15207/files was overwritten by https://github.com/airbytehq/airbyte/pull/15296. Things started to break after upgrading the S3 connector.
    ✍️ 1
    • 1
    • 2
  • m

    Murat Cetink

    10/05/2022, 12:10 AM
    Hi. I wanted to use custom dbt transformations after a sync, using my existing dbt project, but got an error that the project.yml can't be found. I realized that Airbyte looks for the models under its original dbt project (normalization). Is it possible to use my existing dbt project instead of the normalization project?
    ✍️ 1
    ✅ 1
    m
    a
    • 3
    • 9
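For reference, custom dbt transformations are normally attached as an operation that points at your own git repo, so Airbyte clones and runs that project rather than the bundled normalization project. A hedged sketch of creating such an operation through the API; the operatorConfiguration field names are what the 0.40-era OpenAPI spec suggests, so verify them against your instance before relying on this:

```python
import requests

BASE = "http://localhost:8000/api/v1"

operation = {
    "workspaceId": "<workspace-uuid>",                      # hypothetical placeholder
    "name": "my-dbt-project",
    "operatorConfiguration": {
        "operatorType": "dbt",
        "dbt": {
            "gitRepoUrl": "https://github.com/<you>/<dbt-repo>.git",  # your dbt project
            "gitRepoBranch": "main",
            "dockerImage": "fishtownanalytics/dbt:1.0.0",
            "dbtArguments": "run",
        },
    },
}
requests.post(f"{BASE}/operations/create", json=operation).raise_for_status()
```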
  • m

    Maykon Lopes

    10/05/2022, 12:12 AM
    [AWS Aurora MySQL /rdsdbdata/tmp is full]
    ✍️ 1
    s
    • 2
    • 7
  • r

    Robert Zeydelis

    10/05/2022, 1:48 AM
    Hey, I am wondering if anyone uses Airbyte to extract data from dynamodb and ingest it into BigQuery? I am looking to try out this use case with Airbyte.
    ✍️ 1
    • 1
    • 3
  • k

    Kevin Peters

    10/05/2022, 3:13 AM
    Hi everyone, we got an Airbyte instance spun up on a VM to serve as a test run. After some tinkering, everything is working well (we even made a few of our own custom connectors!). We're ready to migrate to a K8s instance with a standalone Postgres database. Everything is connected and the new instance is set up, but we want to transfer the connections and states from the VM to the new K8s instance. Is this possible, and which tables should be moved over? Any recommended methods for doing this?
    ✍️ 2
    c
    m
    +3
    • 6
    • 15
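One way to see what would need to move is to export the existing configuration through the API on the old VM and recreate it on the K8s instance; a minimal export sketch, assuming the config API on port 8000 and default basic auth (adjust both to your setup). Per-connection sync state lives in the internal database rather than in these payloads, so it would need to be carried over separately (or the first syncs re-run).

```python
import json
import requests

BASE = "http://<old-vm-host>:8000/api/v1"   # hypothetical host
AUTH = ("airbyte", "password")              # default proxy credentials on recent versions

ws = requests.post(f"{BASE}/workspaces/list", json={}, auth=AUTH).json()
ws_id = ws["workspaces"][0]["workspaceId"]

export = {}
for resource in ("sources", "destinations", "connections"):
    resp = requests.post(f"{BASE}/{resource}/list", json={"workspaceId": ws_id}, auth=AUTH)
    resp.raise_for_status()
    export[resource] = resp.json()

with open("airbyte-export.json", "w") as f:
    json.dump(export, f, indent=2)
```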
  • c

    Caio César P. Ricciuti

    10/05/2022, 5:49 AM
    Hello All 🙂 quick question: I have set up a BigQuery destination with a default Dataset. Now I'm creating Google Search Console sources that I want in the same project but in different Datasets... Something like: GoogleSearchConsoleConnector1 <> BigQueryDestinationMyProject. The problem I'm facing is that I can't set up a dataset other than the Default Dataset I pointed to in the Destination Configuration. Do I really need to create two Destinations in order to use different Datasets, or am I missing something?
    ✍️ 1
    • 1
    • 3
  • d

    Dakota McCarty

    10/05/2022, 6:34 AM
    Hello, a bit of a newb question... but for S3 destinations, what's the best practice for mapping folders? So far I've had to create a new destination for each folder I am directing my data to. This is inefficient for obvious reasons (especially when we go to rotate our AWS keys 😅). I haven't been able to find any solution here or online.
    ✍️ 1
    m
    m
    • 3
    • 3
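If separate destinations per folder are unavoidable, scripting them at least makes key rotation a one-loop job instead of a manual pass through the UI. A hedged sketch via the API; the S3 connectionConfiguration field names (s3_bucket_name, s3_bucket_path, ...) reflect the S3 destination spec as I understand it, so check them against your connector version:

```python
import requests

BASE = "http://localhost:8000/api/v1"
WORKSPACE_ID = "<workspace-uuid>"                      # hypothetical placeholders
S3_DEFINITION_ID = "<s3-destination-definition-uuid>"  # from /destination_definitions/list

common = {
    "s3_bucket_name": "my-data-lake",
    "s3_bucket_region": "us-east-1",
    "access_key_id": "<ACCESS_KEY>",
    "secret_access_key": "<SECRET_KEY>",
    "format": {"format_type": "Parquet"},
}

for folder in ("raw/orders", "raw/customers", "raw/events"):
    requests.post(
        f"{BASE}/destinations/create",
        json={
            "workspaceId": WORKSPACE_ID,
            "destinationDefinitionId": S3_DEFINITION_ID,
            "name": f"s3-{folder.replace('/', '-')}",
            "connectionConfiguration": {**common, "s3_bucket_path": folder},
        },
    ).raise_for_status()
```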
  • l

    lucien

    10/05/2022, 8:05 AM
    Hi, any idea about this ticket? https://discuss.airbyte.io/t/error-while-processing-event-at-offset/2793/2
    u
    j
    • 3
    • 2
  • s

    Stratos Giouldasis

    10/05/2022, 8:07 AM
    Hello - I just installed Airbyte yesterday using the DigitalOcean one-click droplet install and tried to connect my DigitalOcean Managed Mongo as a source. It kept timing out (I allowed connections from the droplet). I searched all over the internet and found out it's not supported in Airbyte: https://github.com/airbytehq/airbyte/issues/13602 Is that it? No solution, basically? I searched this Slack channel and found people are using Mongo with Airbyte.
    ✍️ 1
    s
    • 2
    • 14
  • h

    Haithem (WOOP)

    10/05/2022, 8:50 AM
    Hey, Airbyte automatically switches the stream Destination Namespace to "Mirror source structure"!! Any idea why that's happening?
    Copy code
    Airbyte version: 0.40.9
    Source: Postgres - 1.0.11
    Destination: BigQuery - 1.2.4
    ✍️ 1
    m
    • 2
    • 2
  • e

    Emilja Dankevičiūtė

    10/05/2022, 10:24 AM
    hello, rather new to Airbyte. In the chat I've read that Helm charts are being developed at the moment? Maybe you guys have an ETA for when they'll be production ready? Or maybe there's a public backlog of sorts?
    ✍️ 1
    s
    • 2
    • 6
  • a

    Anas El Mhamdi

    10/05/2022, 11:42 AM
    Hey y'all 👋🏽, dunno if you can help me out here. I have a fully functioning local connector, but I keep getting the following error:
    java.util.concurrent.ExecutionException: java.lang.RuntimeException: java.lang.NullPointerException: Cannot invoke "com.fasterxml.jackson.databind.node.ObjectNode.put(String, String)" because "matchingSchema" is null
    From my research it seems to be some kind of namespace issue, but I can't seem to find more info on the error.
    ✍️ 1
  • a

    Alexandre Chouraki

    10/05/2022, 12:19 PM
    hello! The Airbyte exchange rate connector seems to have been failing continuously since Sept. 15... I thought it was an issue on my end, but even after upgrading to the basic plan and trying to recreate the source it's still not working. Connection tests are failing with
    The connection tests failed.
    AttributeError("'NoneType' object has no attribute 'get'")
    Any idea what might be wrong? Has anyone made it work?
    ✍️ 1
    s
    • 2
    • 3
  • j

    Julian Felix Rost

    10/05/2022, 1:41 PM
    I'm trying to understand Airbyte Cloud pricing. I've set up a k8s installation of Airbyte, which pumps 33194 rows into BigQuery (raw) - which turns into 18736285 rows of "processed" BigQuery table rows. Which number should I use for Airbyte Cloud cost estimation? One gets me $60 p.a., whereas the other lands at $27k p.a. 😅
    ✍️ 1
    m
    w
    • 3
    • 5
  • g

    Gabriel Souza

    10/05/2022, 3:42 PM
    Hi guys, how are you? I have a running connection in Airbyte from Postgres to BigQuery. The process doesn't fail, but I can't cancel it. Is there a way to stop this process via the API? Or do I need to stop the server?
    ✍️ 1
    • 1
    • 2
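For what it's worth, the internal API does expose job cancellation. A minimal sketch, assuming you know the connectionId of the stuck connection and the API is reachable at /api/v1:

```python
import requests

BASE = "http://localhost:8000/api/v1"   # adjust host/port/auth for your deployment
CONNECTION_ID = "<connection-uuid>"      # hypothetical placeholder

# List sync jobs for the connection (most recent first)...
jobs = requests.post(
    f"{BASE}/jobs/list",
    json={"configTypes": ["sync"], "configId": CONNECTION_ID},
).json()["jobs"]

latest_job_id = jobs[0]["job"]["id"]

# ...and ask the server to cancel the latest one.
requests.post(f"{BASE}/jobs/cancel", json={"id": latest_job_id}).raise_for_status()
```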
  • a

    Alvin Khaled

    10/05/2022, 3:50 PM
    Is it possible to maintain an index in a Postgres destination table (from data coming from Redshift)?
    ✍️ 1
    ✅ 1
    👀 1
    m
    • 2
    • 3
  • d

    Dusty Shapiro

    10/05/2022, 5:41 PM
    Helm-related deploy question: is airbyte-airbyte-server-svc the name of the server svc in the newer charts?
    ✅ 1
    ✍️ 1
    k
    • 2
    • 6
  • c

    claudio viera

    10/05/2022, 6:18 PM
    I'm bringing up Airbyte via docker-compose and trying to configure an SFTP source, but I get the error: The connection tests failed. Could not connect to the server with provided configuration. com.jcraft.jsch.JSchException: timeout: socket is not established. Yet I can connect via FileZilla. https://dlptest.com/ftp-test/ FTP URL: ftp.dlptest.com or ftp://ftp.dlptest.com/ FTP User: dlpuser Password: rNrKYTX9g7z3RgJRmxWuGHbeu port: 21 (the FTP server is public for testing: https://dlptest.com/ftp-test/)
    ✍️ 1
    m
    e
    n
    • 4
    • 12
  • d

    David Zajac

    10/05/2022, 6:24 PM
    Hello, I spun up Airbyte version 0.40.11 on EC2 exactly per the instructions here: https://docs.airbyte.com/deploying-airbyte/on-aws-ec2/ and added a PlanetScale custom connector (see docs here: https://planetscale.com/docs/integrations/airbyte), then tried to add my PlanetScale database as a new source, but it stayed on "Please wait a little bit more..." for an hour. I tried it again with incorrect credentials and it did the same thing, so it is probably not even trying to make a connection in the first place. Could switching to an Alpha version help? Any advice greatly appreciated.
    ✍️ 1
    u
    m
    • 3
    • 4
  • n

    Nathan Gille

    10/05/2022, 6:44 PM
    Why does the Redshift destination require a user with CREATE ON DATABASE permissions? Is there any way around this?
    ✍️ 1
    m
    • 2
    • 5
  • r

    Robert Put

    10/05/2022, 6:46 PM
    Is there any way to help contribute to https://github.com/airbytehq/airbyte/issues/4847?
    ✍️ 1
    • 1
    • 8
  • j

    Jamil B

    10/05/2022, 7:58 PM
    Hello, I'm experiencing some strange behavior: it all says PASS, but the sync history is showing as failed. I just updated the MariaDB and Snowflake connectors.
    ✍️ 1
    u
    • 2
    • 7
  • c

    Chris Nogradi

    10/03/2022, 5:23 PM
    I am trying to get Airbyte working with a private Docker registry. I was able to update the Helm charts for all the images and get this working, and I was able to update the JOB_KUBE_* environment variables to use the private registry for busybox/curl/socat. However, the only way I have found to fix the source/destination image locations is to overwrite the actor_definitions table in the database. This works for syncs that don't have normalization enabled, but it seems the image name is used as the lookup to check which type of normalization to use, and this fails when the image names are overwritten in the database. All this seems kind of hacky and does not yet fully work, so I am trying to just use the imageRegistry setting in the global area of the Helm chart (thus keeping the same image paths as on docker.io), but this does not seem to propagate anywhere (tried for the main worker/webapp/server images and busybox/curl/socat), especially not down to the source and destination image locations. What are my options to get this to work, other than forking the Java code?
    ✍️ 2
    d
    m
    • 3
    • 6
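For completeness, a hedged sketch of the database-level workaround the message describes: prefixing every connector image with the private registry host. The table/column names (actor_definition.docker_repository) are what I would expect on 0.40.x, so verify them in your config database first, and note that the normalization-lookup caveat mentioned above still applies.

```python
import psycopg2

REGISTRY = "registry.internal.example.com"   # hypothetical private registry host

conn = psycopg2.connect("dbname=airbyte user=airbyte password=<pw> host=<db-host>")
with conn, conn.cursor() as cur:
    # Prefix each connector image with the registry, skipping rows already prefixed.
    cur.execute(
        """
        UPDATE actor_definition
        SET docker_repository = %s || '/' || docker_repository
        WHERE docker_repository NOT LIKE %s
        """,
        (REGISTRY, REGISTRY + "/%"),
    )
```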
  • a

    Akul Goel

    10/06/2022, 2:12 AM
    I am experiencing very slow performance on the freshdesk connector for airbyte while importing tickets data. Has anyone else faced the same? Is there a way to mitigate it? As an example, it took me over 5 hours to import 20000 tickets
    ✍️ 1
    e
    s
    +3
    • 6
    • 17