# ask-community-for-troubleshooting

    Alexander Schmidt

    05/11/2023, 3:12 PM
Heyho, we're trying to update Airbyte from 0.40.27 to 0.42.1 (we also tried 0.41.0) but we get the following error. I tried it with new volumes, but that doesn't help either. Has anybody had this problem before, or does anyone know how to handle it?
    Copy code
    2023-05-11 13:44:21 ERROR i.a.b.Application(main):25 - Unable to bootstrap Airbyte environment.
    org.jooq.exception.DataAccessException: SQL [select * from "public"."actor_definition" where ("public"."actor_definition"."release_stage" is null or "public"."actor_definition"."release_stage" <> ?::"public"."release_stage" or "public"."actor_definition"."custom")]; Error while reading field: "public"."actor_definition"."allowed_hosts", at JDBC index: 23

    William Linck

    05/11/2023, 5:36 PM
Hi guys, I'm having an issue using Zendesk Sell as a source. I have already generated the token in the Zendesk platform, but when I test the connection in Airbyte I get this error:
HTTPError('401 Client Error: Unauthorized for url: https://api.getbase.com/v2/contacts')
Does anybody know what could be causing this?

    Sachin Patidar

    05/11/2023, 5:39 PM
Hi All, we are trying to set up a connection with SFDC as the source and Kafka as the destination. When I define the Destination Stream Prefix with "-", it gets converted to an underscore. Also, is there any way to use a regex on stream names, for example to convert upper case to lower case?
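The dash-to-underscore behavior described above is the usual identifier sanitization destinations apply, and Airbyte does not expose a regex rename option in the connection settings. A minimal sketch of that kind of sanitization, to illustrate the effect — the function name and exact character class here are illustrative, not Airbyte's actual implementation:

```python
import re

def sanitize_stream_name(name: str, lowercase: bool = False) -> str:
    """Replace characters that are unsafe in destination identifiers
    with underscores, optionally lowercasing the result."""
    cleaned = re.sub(r"[^A-Za-z0-9_]", "_", name)
    return cleaned.lower() if lowercase else cleaned

print(sanitize_stream_name("sfdc-"))                       # the "-" prefix becomes "sfdc_"
print(sanitize_stream_name("My-Stream", lowercase=True))   # "my_stream"
```

Case mapping of this kind would have to happen in a post-load transformation (e.g. dbt), since the connection UI only offers a fixed prefix.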

    Akash

    05/11/2023, 5:57 PM
How can I move data from Postgres to my data warehouse with the help of Airbyte?


    Thiago Villani

    05/11/2023, 6:57 PM
Hello, I have a source connection with SQL Server and a PostgreSQL destination, using incremental CDC. In my PostgreSQL database several tables were created: _airbyte_raw... and _airbyte_tmp... Can I delete the _airbyte_tmp... tables manually?

    Angie Marable

    05/11/2023, 10:22 PM
Hello - I am updating the connection parameters of a Postgres source. I have changed the host and password; everything else remains the same. However, I am receiving a connection error I haven't encountered before.

    Rahadian Djati

    05/12/2023, 4:41 AM
hi all, I need help: how do I find the Client ID, Client Secret, and Refresh Token for Google Sheets?

    Jesse Geron

    05/12/2023, 6:39 AM
    I see Airbyte mentions having a FreshService connector as a source, but I'm unable to find FreshService when trying to add a new source.

    Akilesh V

    05/12/2023, 10:36 AM
Hi All, how do I set the sync start time to a specific time?
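Exact start times are usually handled with a cron schedule rather than a basic interval. A sketch of the payload fragment used by Airbyte's connection API for cron scheduling, assuming a version that supports it — the field names should be checked against your API version:

```python
# Cron scheduling fires at an exact time instead of "every N hours".
# This Quartz-style expression runs daily at 02:30 UTC.
schedule_data = {
    "scheduleType": "cron",
    "cron": {
        "cronExpression": "0 30 2 * * ?",
        "cronTimeZone": "UTC",
    },
}
```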

    Thiago Comerlatto Rodrigues

    05/12/2023, 3:39 PM
I would appreciate some help with a connection between ClickUp and Postgres. When I fetch data from user and/or team I get a successful sync, but when I add a folder or a task this error keeps popping up:
Copy code
2023-05-12 153621 source > Syncing stream: folder
2023-05-12 153622 source > Backing off _send(...) for 0.0s (airbyte_cdk.sources.streams.http.exceptions.UserDefinedBackoffException: Request URL: https://api.clickup.com/api/v2/space/folder, Response Code: 500, Response Text: {"err":"invalid input syntax for integer: \"folder\"","ECODE":"OAuth_025"})
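Note that the failing request URL contains the literal word folder where ClickUp expects a numeric space ID, which is exactly what the "invalid input syntax for integer" error says — the space ID was likely never substituted into the path. A sketch of the expected URL shape (an illustrative helper, not the connector's code):

```python
def folder_list_url(space_id: int) -> str:
    # ClickUp's v2 API takes a numeric space ID in the path; a literal
    # placeholder like "folder" in its place yields the 500 seen above.
    return f"https://api.clickup.com/api/v2/space/{space_id}/folder"

print(folder_list_url(123))  # https://api.clickup.com/api/v2/space/123/folder
```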

    Joel Olazagasti

    05/12/2023, 4:12 PM
    For connectors that use OAuth and require a redirect uri (in this case, Outreach), where you need to manually point the redirect uri in the application's oauth settings, what should we be setting the redirect uri to? Should this be a path back to our Airbyte instance? Do we need to have a separate OAuth service that handles the OAuth handshake for outreach?

    Valentin B.

    05/12/2023, 4:37 PM
hi everyone 👋 simple question: why are XML and ZIP not supported in the File connector? Any workaround?

    Thiago Comerlatto Rodrigues

    05/12/2023, 7:00 PM
Hello everyone! Is it possible to fetch subtasks with the ClickUp connection?

    Zach Loertscher

    05/12/2023, 7:27 PM
Hey all - we are running Airbyte (v44.3) and I'm getting this error message - any ideas? Loading from SQL Server -> Snowflake. Failure Origin: normalization, Message: Normalization failed during the dbt run. This may indicate a problem with the data itself. Looking at the logs, we are getting this error for all of the integer columns from the tables: 002108 (22000): SQL compilation error: cannot change column CONFIRMATIONOFBENEFITSTATUSASSIGNMENTID from type FLOAT to NUMBER(38,0)

    Harshang Prajapati

    05/13/2023, 5:43 AM
@here I have noticed that while transferring data from MongoDB to Elasticsearch I am facing a parsing error going from _id (Mongo) to _id (Elasticsearch). What would a possible solution for that issue be? I also think this issue will affect all NoSQL sources and destinations.
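One common workaround for the _id clash (a general pattern, not an Airbyte built-in) is to rename Mongo's _id in the document body before indexing, since Elasticsearch reserves _id as document metadata. A minimal sketch; the target field name mongo_id is an arbitrary choice:

```python
def remap_mongo_id(record: dict) -> dict:
    """Rename Mongo's _id so the record body no longer collides with
    Elasticsearch's reserved _id metadata field."""
    out = dict(record)  # shallow copy; leave the original record untouched
    if "_id" in out:
        out["mongo_id"] = str(out.pop("_id"))  # ObjectId serializes as a string
    return out

print(remap_mongo_id({"_id": "644f1c2e", "name": "a"}))
```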

    Thiago Comerlatto Rodrigues

    05/15/2023, 12:43 PM
Hello everyone! I'm having a bit of a hard time fetching information from ClickUp. It works fine when I'm using Power Query, for example, but in Airbyte's source configuration it says that folder_id, space_id or list_id are optional fields, yet when they are left blank it returns a sync error. Another thing I've noticed is that it does not bring subtask information. I can't understand why I have to narrow down the information if I want to load it to a DW.

    Gabriel Levine

    05/15/2023, 3:50 PM
Running Schema Discovery for the NetSuite connector (version 0.1.3) returns "non-json response" despite the pod completing successfully. The existing connection continues to function fine, but additional schema changes can't be made. Airbyte version is 0.44.4

    laila ribke

    05/15/2023, 4:43 PM
Hi all, I get an error while syncing the Airbyte Cloud Shopify source connector on the blogs stream. It seems that the data types defined in the connector are incorrect. Did someone work around it, or modify the connector and can pass me the image? Schema validation errors found for stream _blogs. Error messages: [$.feedburner_location: string found, but [null, integer] is required, $.feedburner: is an invalid date-time]

    Madison Mae

    05/15/2023, 7:44 PM
    Has anyone had any luck solving this issue?
    normalization: handle records > 1MB for redshift SUPER type
    https://github.com/airbytehq/airbyte/issues/14573

    Ignacio Martínez de Toda

    05/16/2023, 7:04 AM
Hi there! I'm trying to set up a new connection from MongoDB to BigQuery, using an already existing source and destination which I'm using in another connection, where they work properly. When I set it up and it's fetching the source schema, it crashes with the error
non-json response
I tested the source and destination and all the tests pass. Can someone help here? I've deployed the latest version, 0.44.4, on a Google VM

    Raunak Chowdhury

    05/16/2023, 7:15 AM
Hi, I have upgraded the Airbyte version to 0.44.3 and started getting the below error message whenever I try to create a new source or destination connection:
    Copy code
i.a.w.p.DockerProcessFactory(create):136 - Creating docker container = source-chargebee-check-9da93b60-44b0-489e-9e8b-81251f7c6108-0-fqxzn with resources io.airbyte.config.ResourceRequirements@1934b36a[cpuRequest=,cpuLimit=,memoryRequest=,memoryLimit=,additionalProperties={}] and allowedHosts null
2023-05-12 10:46:30 INFO i.a.w.p.DockerProcessFactory(create):188 - Preparing command: docker run --rm --init -i -w /data/9da93b60-44b0-489e-9e8b-81251f7c6108/0 --log-driver none --name source-chargebee-check-9da93b60-44b0-489e-9e8b-81251f7c6108-0-fqxzn --network host -v airbyte_workspace:/data -v /tmp/airbyte_local:/local -e DEPLOYMENT_MODE=OSS -e WORKER_CONNECTOR_IMAGE=airbyte/source-chargebee:0.2.3 -e AUTO_DETECT_SCHEMA=true -e LAUNCHDARKLY_KEY= -e SOCAT_KUBE_CPU_REQUEST=0.1 -e SOCAT_KUBE_CPU_LIMIT=2.0 -e USE_STREAM_CAPABLE_STATE=true -e FIELD_SELECTION_WORKSPACES= -e WORKER_ENVIRONMENT=DOCKER -e AIRBYTE_ROLE= -e APPLY_FIELD_SELECTION=false -e WORKER_JOB_ATTEMPT=0 -e OTEL_COLLECTOR_ENDPOINT=http://host.docker.internal:4317 -e FEATURE_FLAG_CLIENT= -e AIRBYTE_VERSION=0.44.3 -e WORKER_JOB_ID=9da93b60-44b0-489e-9e8b-81251f7c6108 airbyte/source-chargebee:0.2.3 check --config source_config.json
2023-05-12 10:46:30 INFO i.a.w.i.VersionedAirbyteStreamFactory(create):181 - Reading messages from protocol version 0.2.0
2023-05-12 10:46:32 ERROR i.a.w.i.VersionedAirbyteStreamFactory(internalLog):313 - [Errno 2] No such file or directory: 'source_config.json'
    Traceback (most recent call last):
      File "/airbyte/integration_code/main.py", line 13, in <module>
        launch(source, sys.argv[1:])
      File "/usr/local/lib/python3.9/site-packages/airbyte_cdk/entrypoint.py", line 131, in launch
        for message in source_entrypoint.run(parsed_args):
      File "/usr/local/lib/python3.9/site-packages/airbyte_cdk/entrypoint.py", line 85, in run
        raw_config = self.source.read_config(parsed_args.config)
      File "/usr/local/lib/python3.9/site-packages/airbyte_cdk/connector.py", line 51, in read_config
        config = BaseConnector._read_json_file(config_path)
      File "/usr/local/lib/python3.9/site-packages/airbyte_cdk/connector.py", line 61, in _read_json_file
        with open(file_path, "r") as file:
    FileNotFoundError: [Errno 2] No such file or directory: 'source_config.json'
    Can someone please help me in resolving this issue?

    Sergei Batishchev

    05/16/2023, 9:16 AM
Hi all! Is there any way to get the deal_type table (ID + Label) from HubSpot using Airbyte? I can read it in Python: client.crm.properties.core_api.get_all('deals'). So I expect to get it in Airbyte too, but cannot find it. Has anyone run into this problem?

    Nils de Bruin

    05/16/2023, 10:37 AM
Hi all! I have the following case and I am curious whether Airbyte could be applied in this situation. I have a customer who has data on an SFTP server. The files on the SFTP server are encrypted with a key and are named .csv.gpg. Is there a possibility to transfer the files, decrypt them, and land them in S3 as the destination using Airbyte? Currently the .gpg files cannot be picked up by Airbyte, as they are not in the .csv / .json form described in the SFTP connector. Thanks!

    geekwhocodes

    05/16/2023, 11:30 AM
#C021JANJ6TY We have an incremental sync set up with a BigQuery destination (with the default normalized tabular data), and normalization is duplicating the last record on each sync. Can someone help us out here?

    Benjamin Edwards

    05/16/2023, 11:31 AM
Hi All, I have set up a connector between Google Ads and Snowflake. The connection is successful; however, I need to obtain metrics such as clicks and impressions for my Google Ads. I believe these are contained in the metrics table, which Airbyte does not seem to make available to transfer. Is anyone aware of how to obtain Google Ads metrics using an Airbyte connector to Google Ads?

    I J

    05/16/2023, 12:30 PM
Hello! Can I use Airbyte to synchronize data from a Postgres database to BigQuery in real time? From what I've seen, syncs can be executed at most every 5 minutes, and otherwise the Airbyte API would need to be used. Is this true, or is there another way to synchronize the data in real time?
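For schedules tighter than the UI's 5-minute minimum, triggering syncs yourself via the API is the usual route. A sketch that builds the request for the configuration API's manual-sync endpoint (POST /api/v1/connections/sync) — the base URL is an assumption for a local deployment, and you would POST the body with any HTTP client from your own scheduler:

```python
import json

AIRBYTE_API = "http://localhost:8000/api/v1"  # assumed local deployment URL

def sync_request(connection_id):
    """Build the endpoint URL and JSON body for manually triggering a sync."""
    url = f"{AIRBYTE_API}/connections/sync"
    body = json.dumps({"connectionId": connection_id})
    return url, body

url, body = sync_request("your-connection-uuid")
# POST `body` to `url` on whatever cadence you need (cron, an orchestrator, etc.)
```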

    Benjamin Edwards

    05/16/2023, 3:58 PM
Hi All, I am using an Airbyte connector from MySQL to Snowflake. The connection is frequently failing with the following exception:
Copy code
2023-05-16 14:50:37 source > ERROR i.a.i.b.AirbyteExceptionHandler(uncaughtException):26 Something went wrong in the connector. See the logs for more details. java.lang.RuntimeException: java.lang.RuntimeException: java.sql.SQLException: Streaming result set com.mysql.cj.protocol.a.result.ResultsetRowsStreaming@3e48d38 is still active. No statements may be issued when any streaming result sets are open and in use on a given connection. Ensure that you have called .close() on any active streaming result sets before attempting more queries.
I wonder if there is a bug in the Java code that is not closing connections? Either way, has anyone faced a similar issue, or does anyone have suggested workarounds? Thanks in advance.

    Ramkumar Vaidyanathan

    05/16/2023, 10:33 PM
Hi, is there documentation on how we can use our RDS database with the Helm chart?