# ask-community-for-troubleshooting

    Praveen Singh

    07/06/2022, 11:35 AM
    What's the best way to manage ingestion of various tables from the same DB while using Postgres as a source? a) Use a new connection for each table b) All tables in the same connection. In the former approach, how do we handle it if a schema re-sync is required for a single table out of n tables?

    Wasantha P. Bandara

    07/06/2022, 12:30 PM
    👋 Hello, team!

    Wasantha P. Bandara

    07/06/2022, 12:31 PM
    Has anyone developed a "campaign-monitors" source connector?

    David Oden

    07/06/2022, 6:07 PM
    Hello everyone, I'm getting an error while trying to create a source following this article https://docs.airbyte.com/connector-development/tutorials/cdk-speedrun/ (CDK speed run HTTP API source creation). The generate.sh script exits with an error at line 38. I have attached some images of my output. Note: 1. I am running on Windows. 2. The error disappeared so fast I had to screen-record to capture it.

    Zawar Khan

    07/06/2022, 6:24 PM
    Hi everyone, I have a question. I want to take a list of ids from one stream and then execute another stream for all of those ids. Is there a recursive way to do that?
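The usual CDK approach here is a parent/child (sub-stream) pattern rather than recursion: one stream produces the ids, and a second stream is sliced over them. A minimal, framework-free Python sketch of the shape (all names here are hypothetical, not the real CDK API):

```python
# Sketch of the parent/child ("sub-stream") pattern: the parent stream
# yields records carrying ids, and the child stream makes one (simulated)
# request per id. No recursion is needed; it's a fan-out loop.

def parent_stream():
    # Stand-in for the first stream's API call.
    return [{"id": 1}, {"id": 2}, {"id": 3}]

def child_stream(parent_id):
    # Stand-in for the second stream's per-id API call.
    return [{"parent_id": parent_id, "detail": f"row-{parent_id}"}]

def read_all():
    records = []
    for parent in parent_stream():  # slice the child stream over every parent id
        records.extend(child_stream(parent["id"]))
    return records

print(read_all())
```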

    Jaye Howell

    07/06/2022, 7:38 PM
    Just started with Airbyte. I have the open source version installed and the container running locally, but I don't see source options for Dynamics Customer Engagement or Dynamics AX. They show on the list of available connectors, but I can't tell if they are cloud-only or also available in the open source (local) version.

    Marcos Marx (Airbyte)

    07/06/2022, 7:50 PM
    Hello 👋 I’m sending this message to help you understand how this channel works. This channel is for basic questions related to Airbyte. Some examples: Does the Salesforce connector support the user region field in the User endpoint? Does Airbyte deploy on AWS container services? If you are facing a deployment or connection issue, please use our Discourse forum to get support. Why do we ask that? Because on Discourse your issue can be discovered by other users in the future, and Discourse has a good GitHub integration, so you’ll receive an update when the issue is fixed in the project.

    Saul Feliz

    07/06/2022, 9:10 PM
    hey guys...I'm debugging a Google Ads Connector, and reading the setup guide, I'm wondering if there are any example requests for the connectors and what they should look like?

    amit zafran

    07/06/2022, 9:15 PM
    Hey guys! I'm new here and want to try contributing to this amazing project! I've opened a new issue for rabbitmq-source. would love if you take a look! 🐰

    Arjunkumar Krishnankutty

    07/07/2022, 2:11 AM
    Hello everyone, new to Airbyte. I'm interested in MySQL CDC and incremental data extraction, with inline transformation of certain fields before loading into a Snowflake destination. Is this a use case that Airbyte can address?

    Amit Gupta

    07/07/2022, 9:23 AM
    Hello Team, I’m getting the following error when deploying Airbyte with the Helm chart on K8s:
    Unknown keyword existingJavaType - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a Non
    2022-07-07 09:22:10 ERROR i.a.s.ServerApp(main):291 - Server failed

    Sergei Kapochkin

    07/07/2022, 1:23 PM
    Hi everyone! If I have multiple fb ads account ids, how can I pass them to the connector (other than the option of multiple fb connectors)?

    Kevin Phan

    07/07/2022, 2:06 PM
    Hey Everyone, Is there something wrong with the docs? I am trying to access https://docs.airbyte.com/integrations/contributing-to-airbyte/ to build out a new select start connector for Airbyte. Looks like a few pages aren't accessible. cc @John (Airbyte) @Greg Solovyev (AirByte) @Harshith (Airbyte) and anyone else. Thanks in advance!

    Daniel

    07/07/2022, 3:24 PM
    Hi everyone! I'm new here using Airbyte and I don't know how Airbyte pulls data from source to destination using Workers and Jobs. I've read these docs but I still don't understand how it works. Could someone give more details about it? • https://docs.airbyte.io/understanding-airbyte/high-level-view • https://docs.airbyte.io/understanding-airbyte/airbyte-specification For some reasons, I had to deploy Airbyte in an on-premise server and I want to move data between Snowflake and a Data Lake, both services are in the same region of AWS. I need to make sure how data is moving around in order to avoid data transfer costs between cloud and on-premise (egress). If Airbyte needs to download data to local and then push it to the cloud again, unfortunately this solution won't fit me. Thank you for having me here and your support! :)
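On the egress concern: the worker launches the source and destination connectors and pipes serialized records between them, so replicated data does pass through the host where Airbyte runs. A toy sketch of that relay shape (plain Python, purely illustrative, not Airbyte's actual code):

```python
import json

# Toy illustration of the worker-in-the-middle data path:
# source -> worker -> destination. Records never travel from
# source to destination directly; they transit the Airbyte host.

def source_read():
    # A source connector emits serialized records on its output.
    for i in range(3):
        yield json.dumps({"record": i})

def destination_write(lines):
    # A destination connector consumes serialized records on its input.
    return [json.loads(line) for line in lines]

def worker(source, destination):
    # The worker relays every record between the two connectors.
    return destination(source())

print(worker(source_read, destination_write))
```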

    Vishal Jain

    07/07/2022, 5:19 PM
    Hey, If I want to contribute to the open source repo from an already created issue, how do I self-assign? Or does someone from the airbyte team have to do the assignment?

    David Beaudway

    07/07/2022, 7:35 PM
    Is the Workspaces feature available when running Airbyte locally? I found some old comments/closed GitHub feature requests, but I'm not seeing it available through the UI.

    Kabilan Ravi

    07/08/2022, 5:07 AM
    Hi Team, I am trying to connect Zoho CRM with Airbyte. After providing all the details, I am getting an error from the create API: "Errors: $.start_datetime: 2022-01-01 is an invalid date-time". I have given the date as start_datetime: "2022-01-01".
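The error suggests the connector validates start_datetime against JSON-Schema's date-time format, which expects a full timestamp rather than a bare date; whether Zoho CRM's connector wants exactly the "2022-01-01T00:00:00Z" form is an assumption worth checking against its spec. A quick illustration in Python:

```python
from datetime import datetime

# A bare date does not match a date-time pattern such as RFC 3339's
# "YYYY-MM-DDTHH:MM:SSZ" (the trailing-"Z" form is an assumed example).
date_only = "2022-01-01"
full_timestamp = "2022-01-01T00:00:00Z"

try:
    datetime.strptime(date_only, "%Y-%m-%dT%H:%M:%SZ")
    date_only_ok = True
except ValueError:
    date_only_ok = False

parsed = datetime.strptime(full_timestamp, "%Y-%m-%dT%H:%M:%SZ")
print(date_only_ok, parsed.year)  # False 2022
```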

    Kabilan Ravi

    07/08/2022, 5:07 AM
    Can someone help please?

    Michael

    07/08/2022, 5:28 AM
    Hello everyone, is it possible for me to modify the webhook function to send or do other things? If yes, where and how? Thank you.

    Akul Goel

    07/08/2022, 12:42 PM
    Hello, while using Google Sheets as a source, is there a way for me to import a full Google directory (which contains multiple Google Sheets)?

    Erik Wickstrom

    07/08/2022, 7:59 PM
    I’m trying to set up Snowflake as a destination, and I keep getting this error when testing the connection:
    Could not connect with provided configuration. Cannot perform CREATE TABLE. This session does not have a current database. Call 'USE DATABASE', or use a qualified name.
    The user/role I am connecting with owns the schema.
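This error usually means the session has no current database in context, so an unqualified CREATE TABLE has nowhere to land. A hedged sketch of what to check in a Snowflake worksheet (MY_DB, MY_SCHEMA, and MY_ROLE are placeholder names; the destination config's database field should match a database the role can actually see):

```sql
-- Placeholders: MY_DB, MY_SCHEMA, MY_ROLE are hypothetical names.
-- Confirm the role used by the connection can resolve the database:
USE ROLE MY_ROLE;
USE DATABASE MY_DB;          -- fails if the role cannot see the database
USE SCHEMA MY_DB.MY_SCHEMA;
-- Inspect what the role has actually been granted:
SHOW GRANTS TO ROLE MY_ROLE;
```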

    Ashwini Mali

    07/09/2022, 5:40 AM
    Hello, I'm currently using Airbyte (version 39.1) to ingest a table from PostgreSQL (version 9.5) to Local JSON. On the Airbyte replication page, when I try to search for a stream name in the search bar, nothing shows up, and the options to define source name, sync mode, and destination name are not available either. How can I solve this problem, and which is the latest Airbyte version?

    Manas Hardas

    07/11/2022, 4:38 AM
    Hi everyone, we are trying out Airbyte internally in my org. One of the use cases is to get app review data from the Apple App Store and Google Play Store. I found that the Apple App Store connector is in alpha, but it does not ingest review-level data. Also, the last PR for this connector was in late December 2021. Are there any further plans to develop this? Would love any clarity on this. Thanks!

    Ismaël Hommani

    07/11/2022, 10:45 AM
    Hi all, I see no way to configure a custom Salesforce domain in the Salesforce connector. Reading the connector's source, it seems the login URL is hardcoded. https://github.com/airbytehq/airbyte/blob/94e3e0ea27ab84c13d1283d1078f2f7735285ae5[…]egrations/connectors/source-salesforce/source_salesforce/api.py Indeed, I later get a connection error. So I guess I would have to deploy my own connector version to test my connection. Does that sound right?

    Sergei Kapochkin

    07/11/2022, 3:51 PM
    Hi! I’m using the S3 source and have set the path, credentials, and pattern, all valid. My files are .gz, but every time I try to make a connection I get zero records. Are there any hints?

    Yifan Sun

    07/11/2022, 9:35 PM
    Does anyone know where I can find the Airbyte version I'm using?

    suman

    07/12/2022, 6:28 AM
    Hello! I want to connect to Google Sheets using Airbyte open source deployed in our infra. Do we have to have our own Google dev API credentials to do so?

    Ernest POPOVICI

    07/12/2022, 9:06 AM
    Hello, I am trying to configure Airbyte to ingest data from a Postgres database to S3. The connections all pass; the only issue I have is that Airbyte doesn't seem to see all the tables. I thought it might be a user or schema issue without access to all tables, but using a query tool to connect to the same database with the same user credentials and on the same schema, I can see all the tables I'm supposed to. I thought it could be a connector bug; I also heard it could be an issue of table ownership (even if Airbyte has full admin access to the tables, it doesn't see some tables its user is not considered the owner of). I am really unsure what the issue is. Does anybody have some insight into what the problem might be / how I can solve it?
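One way to narrow this down is to compare what Postgres itself reports as visible to the Airbyte user against what the UI shows. A diagnostic sketch (the role name airbyte and the schema public are assumptions to adjust):

```sql
-- Run these connected as the same user Airbyte uses.
-- Tables the current role can see in the schema:
SELECT table_schema, table_name
FROM information_schema.tables
WHERE table_schema = 'public';

-- Explicit privileges granted to the Airbyte role:
SELECT table_schema, table_name, privilege_type
FROM information_schema.table_privileges
WHERE grantee = 'airbyte';
```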

    tharaka prabath

    07/12/2022, 9:12 AM
    Hi guys, is there any way to add a sync limit to Airbyte? I mean, only sync the last 2000 records?
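No built-in "last N records" limit comes up in this thread; as a workaround sketch, a custom connector's read loop could cap what it emits. Illustrative Python only, not a real Airbyte API:

```python
from collections import deque

def read_records():
    # Stand-in for a stream's record generator.
    for i in range(10_000):
        yield {"id": i}

# Keep only the most recent 2000 records seen on the stream.
last_2000 = deque(read_records(), maxlen=2000)
print(len(last_2000), last_2000[0]["id"])  # 2000 8000
```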

    Manel Rhaiem

    07/12/2022, 9:41 AM
    Hello people 👋🏻 I am trying to set up kafka as a destination locally, using our staging Kafka Confluent cluster. I am using the SASL_SSL protocol with the PLAIN mechanism, specifying a JAAS config of
    org.apache.kafka.common.security.plain.PlainLoginModule required username="my_user" password="my_password";
    But I am not able to get the connection working, and I see this failure log:
    2022-07-12 09:37:29 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - org.apache.kafka.common.KafkaException: Failed to construct kafka producer
    2022-07-12 09:37:29 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 	at org.apache.kafka.clients.producer.KafkaProducer.<init>(KafkaProducer.java:440) ~[kafka-clients-2.8.0.jar:?]
    2022-07-12 09:37:29 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 	at org.apache.kafka.clients.producer.KafkaProducer.<init>(KafkaProducer.java:291) ~[kafka-clients-2.8.0.jar:?]
    2022-07-12 09:37:29 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 	at org.apache.kafka.clients.producer.KafkaProducer.<init>(KafkaProducer.java:274) ~[kafka-clients-2.8.0.jar:?]
    2022-07-12 09:37:29 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 	at io.airbyte.integrations.destination.kafka.KafkaDestinationConfig.buildKafkaProducer(KafkaDestinationConfig.java:76) ~[io.airbyte.airbyte-integrations.connectors-destination-kafka-0.39.20-alpha.jar:?]
    2022-07-12 09:37:29 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 	at io.airbyte.integrations.destination.kafka.KafkaDestinationConfig.<init>(KafkaDestinationConfig.java:32) ~[io.airbyte.airbyte-integrations.connectors-destination-kafka-0.39.20-alpha.jar:?]
    2022-07-12 09:37:29 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 	at io.airbyte.integrations.destination.kafka.KafkaDestinationConfig.getKafkaDestinationConfig(KafkaDestinationConfig.java:38) ~[io.airbyte.airbyte-integrations.connectors-destination-kafka-0.39.20-alpha.jar:?]
    2022-07-12 09:37:29 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 	at io.airbyte.integrations.destination.kafka.KafkaDestination.check(KafkaDestination.java:48) [io.airbyte.airbyte-integrations.connectors-destination-kafka-0.39.20-alpha.jar:?]
    2022-07-12 09:37:29 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 	at io.airbyte.integrations.base.IntegrationRunner.runInternal(IntegrationRunner.java:140) [io.airbyte.airbyte-integrations.bases-base-java-0.39.20-alpha.jar:?]
    2022-07-12 09:37:29 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 	at io.airbyte.integrations.base.IntegrationRunner.run(IntegrationRunner.java:107) [io.airbyte.airbyte-integrations.bases-base-java-0.39.20-alpha.jar:?]
    2022-07-12 09:37:29 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 	at io.airbyte.integrations.destination.kafka.KafkaDestination.main(KafkaDestination.java:85) [io.airbyte.airbyte-integrations.connectors-destination-kafka-0.39.20-alpha.jar:?]
    2022-07-12 09:37:29 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - Caused by: java.lang.IllegalArgumentException: Could not find a 'KafkaClient' entry in the JAAS configuration. System property 'java.security.auth.login.config' is not set
    2022-07-12 09:37:29 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 	at org.apache.kafka.common.security.JaasContext.defaultContext(JaasContext.java:131) ~[kafka-clients-2.8.0.jar:?]
    2022-07-12 09:37:29 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 	at org.apache.kafka.common.security.JaasContext.load(JaasContext.java:96) ~[kafka-clients-2.8.0.jar:?]
    2022-07-12 09:37:29 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 	at org.apache.kafka.common.security.JaasContext.loadClientContext(JaasContext.java:82) ~[kafka-clients-2.8.0.jar:?]
    2022-07-12 09:37:29 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 	at org.apache.kafka.common.network.ChannelBuilders.create(ChannelBuilders.java:167) ~[kafka-clients-2.8.0.jar:?]
    2022-07-12 09:37:29 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 	at org.apache.kafka.common.network.ChannelBuilders.clientChannelBuilder(ChannelBuilders.java:81) ~[kafka-clients-2.8.0.jar:?]
    2022-07-12 09:37:29 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 	at org.apache.kafka.clients.ClientUtils.createChannelBuilder(ClientUtils.java:105) ~[kafka-clients-2.8.0.jar:?]
    2022-07-12 09:37:29 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 	at org.apache.kafka.clients.producer.KafkaProducer.newSender(KafkaProducer.java:448) ~[kafka-clients-2.8.0.jar:?]
    2022-07-12 09:37:29 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 	at org.apache.kafka.clients.producer.KafkaProducer.<init>(KafkaProducer.java:429) ~[kafka-clients-2.8.0.jar:?]
    Any idea what I am missing here?
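The root cause in the trace is `Could not find a 'KafkaClient' entry in the JAAS configuration`, which typically means the JAAS string never reached the producer as the `sasl.jaas.config` client property, so it fell back to the (unset) system JAAS file. A hedged sketch of the client properties a SASL_SSL + PLAIN setup usually needs (broker address and credentials are placeholders; how these map onto the Airbyte UI fields is an assumption to verify):

```properties
# Placeholders throughout; if sasl.jaas.config is missing, the client
# looks for a system JAAS file and fails exactly as in the log above.
bootstrap.servers=broker:9092
security.protocol=SASL_SSL
sasl.mechanism=PLAIN
sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required username="my_user" password="my_password";
```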