# ask-community-for-troubleshooting

    Shyngys Nurzhan

    04/15/2022, 4:11 AM
    Hi team, I opened a PR changing sanitize-html to DOMPurify. Could you review it, please?

    Keshav Agarwal

    04/15/2022, 12:54 PM
    Hi, when will this be released? It shows as merged: https://github.com/airbytehq/airbyte/pull/10905

    Goffredo Lepori

    04/15/2022, 2:29 PM
    Hi, I am creating a custom connector for InfluxDB and I am wondering if it's possible (and makes sense) to add sources using a regular expression, e.g. all files in a folder or all tags in a measurement. Thanks for the advice

    Harsh Vardhan

    04/15/2022, 6:40 PM
    Hi, I am using Airbyte to sync data locally to CSV from Facebook Marketing. I get this error after running for some time, and Airbyte retries. What may be causing this?
    logs-13.txt

    jijo

    04/16/2022, 12:21 AM
    Hello, I am trying to connect Airbyte with a local CSV file source and a local PostgreSQL destination. I am getting Error: Cannot create PoolableConnectionFactory (Connection to localhost:5432 refused. Check that the hostname and port are correct and that the postmaster is accepting TCP/IP connections.) - Any suggestions?
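A common cause of this "connection refused" error when both Airbyte and Postgres run in Docker on the same machine is that `localhost` inside the connector container refers to the container itself, not the host. A minimal sketch of the fix, assuming Docker Desktop's `host.docker.internal` alias is available (`rewrite_host_for_docker` is a hypothetical helper name, not an Airbyte API):

```python
# Sketch: inside a Dockerized Airbyte connector, "localhost" resolves to the
# connector container, not the machine running Postgres. On Docker Desktop,
# "host.docker.internal" resolves to the host instead.

def rewrite_host_for_docker(config: dict) -> dict:
    """Return a copy of a connection config with localhost replaced by the
    Docker host alias, leaving any other host untouched."""
    fixed = dict(config)
    if fixed.get("host") in ("localhost", "127.0.0.1"):
        fixed["host"] = "host.docker.internal"
    return fixed

print(rewrite_host_for_docker({"host": "localhost", "port": 5432}))
# → {'host': 'host.docker.internal', 'port': 5432}
```

On Linux (where `host.docker.internal` may not exist), the host's LAN IP in the connection settings serves the same purpose.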

    Mikhail Masyagin

    04/16/2022, 3:21 AM
    Hello! I have a connector to Zendesk Support and it runs extremely slow. It fails to fetch 1 year of `ticket_audits` data, failing at 2 months with back-off. It also has too many duplicates in the data. The company I connect to Zendesk Support for is really small, so I don't think it produces gigabytes of data each month. What can I do about Zendesk Support ticket audits? Does anybody else have this bug? Everything is OK with Zendesk Support satisfaction ratings.

    Taufiq Wahyu Ibrahim

    04/16/2022, 12:34 PM
    Hi all, I am trying to test Airbyte MySQL CDC using the Sakila database as source and BigQuery Denormalized as destination. However, there's an error when Airbyte tries to read a table that has `decimal` columns.
```
Exception in thread "main" tech.allegro.schema.json2avro.converter.AvroConversionException: Failed to convert JSON to Avro: Could not evaluate union, field amount is expected to be one of these: NULL, DOUBLE. If this is a complex type, check if offending field: amount adheres to schema.
```
```
mysql> describe payment;
+--------------+-------------------+------+-----+-------------------+-----------------------------------------------+
| Field        | Type              | Null | Key | Default           | Extra                                         |
+--------------+-------------------+------+-----+-------------------+-----------------------------------------------+
| payment_id   | smallint unsigned | NO   | PRI | NULL              | auto_increment                                |
| customer_id  | smallint unsigned | NO   | MUL | NULL              |                                               |
| staff_id     | tinyint unsigned  | NO   | MUL | NULL              |                                               |
| rental_id    | int               | YES  | MUL | NULL              |                                               |
| amount       | decimal(5,2)      | NO   |     | NULL              |                                               |
| payment_date | datetime          | NO   |     | NULL              |                                               |
| last_update  | timestamp         | YES  |     | CURRENT_TIMESTAMP | DEFAULT_GENERATED on update CURRENT_TIMESTAMP |
+--------------+-------------------+------+-----+-------------------+-----------------------------------------------+
```
    The connection works fine when I only replicate tables without a `decimal` column. Thanks
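The error says the Avro union for `amount` only admits NULL or DOUBLE, while the `decimal(5,2)` value arrives as something that matches neither branch. A rough illustration of the mismatch (this is not Airbyte's actual json2avro converter, just a toy model of a NULL/DOUBLE union check):

```python
from decimal import Decimal

# Toy model: an Avro union of NULL and DOUBLE accepts None or float, so a
# Decimal (or a string, which CDC pipelines sometimes emit for DECIMAL
# columns) is rejected unless coerced to float first.

def matches_null_double_union(value) -> bool:
    """True if value fits an Avro union of NULL and DOUBLE."""
    return value is None or isinstance(value, float)

def coerce_decimal(value):
    """Coerce Decimal/str amounts to float so they fit the union."""
    if isinstance(value, (Decimal, str)):
        return float(value)
    return value

amount = Decimal("4.99")                     # what a decimal(5,2) column yields
print(matches_null_double_union(amount))                  # rejected as-is
print(matches_null_double_union(coerce_decimal(amount)))  # accepted once a float
```

Note that coercing to float loses exactness for money values; that trade-off is why decimal handling in CDC connectors is fiddly.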

    Sami RIAHI

    04/16/2022, 1:53 PM
    I need your help, please. I increased the fetch size of the MSSQL connector from 1K to 10K, then tried to rebuild the connector, but I get this error and I don't know what to do.

    Shanhui Bono

    04/17/2022, 7:06 PM
    How often are logs cleaned? Is there a way to check/update the setting? My instance has been up and running for 5 days and there's 5 GB of logs. I want to make sure my ec2 doesn't run out of storage (128 GB)
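Until a retention setting turns up, one stopgap (a generic sketch, not an Airbyte feature) is a cron job that prunes job logs older than a retention window. The selection logic, kept pure so it is easy to test before deleting anything:

```python
import time

# Generic sketch, not an Airbyte setting: pick log paths whose mtime is
# older than a retention window, so a cron job could delete them.

def stale_logs(entries, retention_days, now=None):
    """entries: iterable of (path, mtime_epoch); returns paths past retention."""
    now = time.time() if now is None else now
    cutoff = now - retention_days * 86400
    return [path for path, mtime in entries if mtime < cutoff]

logs = [("logs/job-1.log", 0), ("logs/job-2.log", 1_000_000_000)]
print(stale_logs(logs, retention_days=7, now=1_000_000_100))
# → ['logs/job-1.log']
```

In practice the (path, mtime) pairs would come from `os.scandir` over the workspace volume, and the cron job would `os.remove` the returned paths.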

    Enrico Tuvera Jr.

    04/18/2022, 4:58 AM
    hey, is airbyte generating `modules.yml` and `module-settings.yml`? or is that something else

    Aditya Tripathi

    04/18/2022, 8:44 AM
    Hello everyone, I am a beginner with Airbyte, have set up the env locally, and am trying to create a custom destination (Postgres) in Python. The spec and check functions are created and running fine. Just a question: how should queries be written in the write function (the main workhorse of a destination connector) so it can receive data from the source and write it into a new table in the destination? Should I use SQL queries? Basically, how does it consume the source data and write it to the destination database? Please help me with this. Thanks in advance
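For orientation, a destination's write step reads newline-delimited AirbyteMessage JSON from stdin and issues an insert per RECORD message. A much-simplified sketch of that loop's core, assuming the standard AirbyteMessage shape (`insert_sql` is a hypothetical helper; a real connector batches records and runs parameterized queries through a driver such as psycopg2):

```python
import json

# Simplified sketch of a destination's write loop: Airbyte feeds the
# destination newline-delimited AirbyteMessage JSON on stdin; for each
# RECORD message we build an INSERT targeting the record's stream (table).
# insert_sql is a hypothetical helper name, not part of the Airbyte CDK.

def insert_sql(line: str):
    """Turn one AirbyteMessage line into (sql, values), or None for non-records."""
    msg = json.loads(line)
    if msg.get("type") != "RECORD":
        return None  # e.g. STATE messages are checkpoints, not data
    record = msg["record"]
    cols = list(record["data"])
    placeholders = ", ".join(["%s"] * len(cols))
    sql = f'INSERT INTO {record["stream"]} ({", ".join(cols)}) VALUES ({placeholders})'
    return sql, [record["data"][c] for c in cols]

line = '{"type": "RECORD", "record": {"stream": "users", "data": {"id": 1, "name": "Ada"}, "emitted_at": 0}}'
print(insert_sql(line))
# → ('INSERT INTO users (id, name) VALUES (%s, %s)', [1, 'Ada'])
```

The driver (not string formatting) should fill the `%s` placeholders, so values are escaped safely.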

    Ramansh Sangal

    04/18/2022, 10:59 AM
    Hello Everyone, I am trying to develop a custom connector using Airbyte CDK

    Ramansh Sangal

    04/18/2022, 11:01 AM
    I am getting a certain error, not sure whether it's due to Airbyte or Docker. Please have a look

    Ramansh Sangal

    04/18/2022, 11:01 AM
    image.png

    Harsh Vardhan

    04/18/2022, 12:37 PM
    Hi, I am working with the Facebook Marketing connector and facing some issues that I am not sure about. 1. When getting the ad insights data, I am getting a lot more rows than I expected, and as far as I can tell all with null values for metrics. 2. Most of them have start dates before the dates that I put in. 3. None of the custom metrics were loaded; how can I add them to the schema, if possible? Would love to connect with someone who has worked with the Facebook Marketing connector before. Thanks in advance

    Kev Daly

    04/18/2022, 1:06 PM
    Hi all, I have a general question about Airbyte's sync mode functionality. I have a table in a connection set to Incremental / Deduped + history, because we wanted the most recent version of each record to replace the pre-existing one when changes occurred. The cursor field was the timestamp field in the table that identifies when an existing record changed. In testing, I realized that changes to a record were not actually reaching the destination table beyond the timestamp / cursor field, which was the only field updating. Obviously this is a problem, so I would love to understand the best way to handle this case. I read the documentation pages on the different sync modes, but I still need to learn more. Thanks very much!
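For intuition, Incremental / Deduped + history is meant to keep, per primary key, the entire record carrying the latest cursor value, so every field should reflect the newest version, not just the cursor column. A toy model of those expected semantics (not Airbyte's actual dbt-based normalization):

```python
# Toy model of "Incremental | Deduped + history" final-table semantics:
# for each primary key, keep the whole record with the greatest cursor
# value. Not Airbyte's real normalization, which is generated SQL/dbt.

def dedupe_latest(records, pk, cursor):
    """Keep the latest full record per primary key, ordered by pk."""
    latest = {}
    for rec in records:
        key = rec[pk]
        if key not in latest or rec[cursor] > latest[key][cursor]:
            latest[key] = rec
    return sorted(latest.values(), key=lambda r: r[pk])

rows = [
    {"id": 1, "status": "old", "updated_at": "2022-01-01"},
    {"id": 1, "status": "new", "updated_at": "2022-02-01"},
    {"id": 2, "status": "only", "updated_at": "2022-01-15"},
]
print(dedupe_latest(rows, pk="id", cursor="updated_at"))
# id 1 keeps status "new" along with the newer timestamp
```

If only the cursor column updates in the destination, that points at the dedup picking the right row but not carrying its non-cursor fields through, which is worth raising with the normalization logs attached.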

    Sahar Zelonagora

    04/18/2022, 4:42 PM
    👋 I would like to collaborate on a connector. What is the best way to have a private repo with all of the connector files? Should I create a branch with airbyte as a remote? Thanks

    Siva Kowsika

    04/19/2022, 2:01 AM
    Hello. I have a question re: the Netsuite connector. It seems to be the most upvoted source connector in demand. Just curious, when is it scheduled for release? Thx

    Narender Kumar

    04/19/2022, 3:20 AM
    Hello. I want to contribute to Airbyte. Is there any documentation available for setting up a development environment, so that issues can be debugged and fixed?

    Dmitry Parpura

    04/19/2022, 7:13 AM
    Hello. I have a problem. I am trying to connect Salesforce to Airbyte (https://airbyte.com/tutorials/salesforce-zendesk-analytics) and on the last step I get an error:
    curl: (6) Could not resolve host:
    What is the problem? How do I solve it?

    Ashwini R

    04/19/2022, 7:16 AM
    👋 Hello, team!

    Ashwini R

    04/19/2022, 7:19 AM
    I am trying to load data from a SQL table into another SQL table in a different database. The data got loaded to the destination successfully, but the log says failure and gives the error message below. Please help me resolve the issue.

    Dmitry Parpura

    04/19/2022, 9:01 AM
    Hello. I am trying to connect a Salesforce source and a ClickHouse destination, but I get the following error. How do I solve this problem?

    Ashwini R

    04/19/2022, 10:04 AM
    Hi, I have loaded data from SQL to CSV. The sync was successful, but I am unable to find the CSV file on my local machine. Please guide me on how to find the CSV.

    Ashwini R

    04/19/2022, 10:05 AM
    image.png

    Shah Newaz Khan

    04/19/2022, 1:20 PM
    Hi team, I am currently trying to deprecate `minio` from the `gke k8s airbyte` deployment and only use `gcs` for logging. When I remove the `aws/s3`-specific envars from the worker pod I get the following error:
```
Appender java.lang.IllegalStateException: No factory method found for class com.van.logging.log4j2.Log4j2Appender
        at org.apache.logging.log4j.core.config.plugins.util.PluginBuilder.findFactoryMethod(PluginBuilder.java:234)
        at org.apache.logging.log4j.core.config.plugins.util.PluginBuilder.build(PluginBuilder.java:134
```
    As far as I can tell, this is because the logging plugin is unable to find the directory to log to. Are there any other envars I need to set? These are the ones I removed from the `worker.yaml` envars:
```yaml
- name: S3_LOG_BUCKET
  valueFrom:
    configMapKeyRef:
      name: airbyte-env
      key: S3_LOG_BUCKET
- name: S3_LOG_BUCKET_REGION
  valueFrom:
    configMapKeyRef:
      name: airbyte-env
      key: S3_LOG_BUCKET_REGION
- name: AWS_ACCESS_KEY_ID
  valueFrom:
    secretKeyRef:
      name: airbyte-secrets
      key: AWS_ACCESS_KEY_ID
- name: AWS_SECRET_ACCESS_KEY
  valueFrom:
    secretKeyRef:
      name: airbyte-secrets
      key: AWS_SECRET_ACCESS_KEY
- name: S3_MINIO_ENDPOINT
  valueFrom:
    configMapKeyRef:
      name: airbyte-env
      key: S3_MINIO_ENDPOINT
- name: S3_PATH_STYLE_ACCESS
  valueFrom:
    configMapKeyRef:
      name: airbyte-env
      key: S3_PATH_STYLE_ACCESS
```
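For comparison, a GCS-only logging setup supplies its own bucket and credentials variables in the same style. A sketch, assuming the `GCS_LOG_BUCKET` and `GOOGLE_APPLICATION_CREDENTIALS` variables described in Airbyte's external-logging docs and reusing the same configmap/secret names as above (verify both against your Airbyte version):

```yaml
- name: GCS_LOG_BUCKET
  valueFrom:
    configMapKeyRef:
      name: airbyte-env
      key: GCS_LOG_BUCKET
- name: GOOGLE_APPLICATION_CREDENTIALS
  valueFrom:
    secretKeyRef:
      name: airbyte-secrets
      key: GOOGLE_APPLICATION_CREDENTIALS
```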

    Adit Modi

    04/19/2022, 1:32 PM
    Hi everyone, I am new to Airbyte. Our team is looking to use Airbyte for different sources, ranging from HTTP APIs (web-scraped websites) to websites containing datasets, like Kaggle. We are looking to create custom connectors for these sources, and I am looking for a guide on how to get started. More details in 🧵

    Eliot Salant

    04/19/2022, 2:59 PM
    Probably a simple question, but I can't figure out how, using the Docker API, I can pipe the result of reading a CSV file into writing a CSV file. My simple example is:
```
echo '{"type": "RECORD", "record": {"stream": "testing", "data": {"DOB": "01/02/1988", "First Name": "John", "Last NAME": "Jones"}, "emitted_at": 1650284493000}}' | docker run -v /Users/eliot/temp:/tmp airbyte/destination-csv write --catalog /tmp/airbyte_catalog.txt --config /tmp/airbyte_write.json

2022-04-19 14:47:36 INFO i.a.i.b.IntegrationCliParser(parseOptions):118 - integration args: {catalog=/tmp/airbyte_catalog.txt, write=null, config=/tmp/airbyte_write.json}
2022-04-19 14:47:36 INFO i.a.i.b.IntegrationRunner(runInternal):105 - Running integration: io.airbyte.integrations.destination.csv.CsvDestination
2022-04-19 14:47:36 INFO i.a.i.b.IntegrationRunner(runInternal):106 - Command: WRITE
2022-04-19 14:47:36 INFO i.a.i.b.IntegrationRunner(runInternal):107 - Integration config: IntegrationConfig{command=WRITE, configPath='/tmp/airbyte_write.json', catalogPath='/tmp/airbyte_catalog.txt', statePath='null'}
2022-04-19 14:47:36 WARN c.n.s.JsonMetaSchema(newValidator):338 - Unknown keyword examples - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
```
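One way to wire a source's `read` into a destination's `write` (a sketch, not an official recipe) is to pipe the source container's stdout into the destination container's stdin, keeping only the message types a destination consumes. A small filter that could sit between the two `docker run` commands:

```python
import json

# Sketch of a filter between "docker run <source> read ..." and
# "docker run -i <destination> write ...": forward only RECORD and STATE
# AirbyteMessages, dropping everything else (LOG messages, stray plain-text
# output that would confuse the destination's JSON parser).

def keep_line(line: str) -> bool:
    try:
        msg = json.loads(line)
    except ValueError:
        return False  # not an AirbyteMessage at all (e.g. plain log text)
    return isinstance(msg, dict) and msg.get("type") in ("RECORD", "STATE")

def filter_stream(lines):
    return [line for line in lines if keep_line(line)]

sample = [
    '{"type": "RECORD", "record": {"stream": "testing", "data": {"x": 1}}}',
    'some stray log line',
    '{"type": "STATE", "state": {}}',
]
print(filter_stream(sample))  # only the RECORD and STATE lines survive
```

In a real pipeline this script would read `sys.stdin` and write `sys.stdout` between the two commands, roughly `docker run <source> read … | python filter.py | docker run -i <destination> write …`; the `-i` flag matters, since without it the destination container's stdin is closed.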

    Romain LOPEZ

    04/19/2022, 5:40 PM
    Hey Airbyte team, it's been a while since I used Airbyte. Can you confirm whether this is a good idea: I need to dump a SQL Server DB to an Azure PostgreSQL database (one shot), but the SQL Server DB has networking enabled only for my personal computer. I was thinking of installing Airbyte locally to process this extract.