# troubleshooting
  • Augustin Lafanechere (Airbyte)

    03/15/2022, 4:50 PM
    Hi @Sophie Lohezic, I think both of your assumptions are correct. "JSON schema validation failed." means you might have a record that does not comply with the source connector's schema and was therefore discarded. The other error is a red herring: it's a flaky, harmless error that we'll fix, but it pollutes the logs at the moment.
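The kind of schema-validation discard described above can be illustrated with a small sketch. This is a dependency-free illustration of the idea (real connectors use full JSON Schema validation); the `users`-style fields below are hypothetical, not taken from any actual connector.

```python
# Minimal sketch of JSON-schema-style type checking, showing why a record
# can be discarded. Field names and the schema are illustrative only.
TYPE_MAP = {"string": str, "integer": int, "number": (int, float)}

def conforms(record: dict, schema: dict) -> bool:
    """Return True if every schema field present in the record has the expected type."""
    for field, spec in schema["properties"].items():
        if field in record and not isinstance(record[field], TYPE_MAP[spec["type"]]):
            return False
    return True

schema = {"properties": {"id": {"type": "integer"}, "email": {"type": "string"}}}
good = {"id": 1, "email": "a@example.com"}
bad = {"id": "1", "email": "a@example.com"}  # id is a string, not an integer

print(conforms(good, schema))  # True
print(conforms(bad, schema))   # False: this record would be discarded
```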
  • Connor Lough

    03/15/2022, 9:40 PM
    Hey all, I made a connection between Postgres (source) and Snowflake (dest.). The sync says it's successful, but then doesn't show any schemas/tables? I made sure not to specify a specific schema in the connection so that all would be grabbed by default. Any ideas why? Is there a certain level of permissions my user needs on the postgres side?
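On the permissions question: the replicating user generally needs to be able to connect, use the schemas, and select from the tables. A hedged sketch of the usual grants (the database, role, and schema names below are placeholders, not from this thread):

```sql
-- Placeholder names: replace mydb / airbyte_user / public with your own.
GRANT CONNECT ON DATABASE mydb TO airbyte_user;
GRANT USAGE ON SCHEMA public TO airbyte_user;
GRANT SELECT ON ALL TABLES IN SCHEMA public TO airbyte_user;
-- Cover tables created later as well:
ALTER DEFAULT PRIVILEGES IN SCHEMA public GRANT SELECT ON TABLES TO airbyte_user;
```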
  • Felix

    03/15/2022, 10:41 PM
    Airbyte Version: 0.35.53-alpha
    Source name/version: Google Analytics or MsSQL
    Destination name/version: MsSQL
    It would be great if you could create a GitHub issue, since this is actually a real UX flaw. The video shows (a) that the front end is very unresponsive when working with sources that have more than 50 tables, and (b) that the check/uncheck-all box does not work at all, independently of the number of tables in a DB: the second example uses Google Analytics, which has fewer than 15 tables. https://www.loom.com/share/b23fba1b207d4012bab0f01c75f881d4
  • Shubhank Vijayvergiya

    03/15/2022, 11:59 PM
    Airbyte Version: 0.35.46-alpha
    Source: GoogleAds (0.1.28)
    Destination: S3 (0.2.9)
    Description: While syncing ad_group_ads, ad_groups, and campaigns, the connection is failing. We suspect this is because of JSON schema validation. Attaching logs for reference. If true, can we skip the schema validation?
  • Marcos Marx (Airbyte)

    03/16/2022, 1:16 AM
    @Artem Merkulov can you check this discussion here? https://airbytehq-team.slack.com/archives/C01MFR03D5W/p1647356085246729?thread_ts=1646981536.738209&cid=C01MFR03D5W
  • Marcos Marx (Airbyte)

    03/16/2022, 1:33 AM
    @Carlos Silva can you ask the cluster admin to check whether a pod is allowed to launch other pods? The Airbyte worker tries to launch the source and destination pods.
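For reference, on Kubernetes this permission is usually granted through RBAC: the service account the worker runs as needs a Role allowing it to manage pods in its namespace. A minimal sketch (the role name and namespace are placeholders; bind it with a RoleBinding to the worker's service account):

```yaml
# Placeholder names; adjust namespace and bind to the worker's service account.
apiVersion: rbac.authorization.k8s.io/v1
kind: Role
metadata:
  name: pod-launcher
  namespace: airbyte
rules:
  - apiGroups: [""]
    resources: ["pods"]
    verbs: ["get", "list", "watch", "create", "delete"]
```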
  • Matt Sterling

    03/16/2022, 2:10 AM
    I have an instance deployed on a GCP compute instance. It was running fine, but now I'm getting the error below. I've restarted the server docker container, but I'm still getting the same problem. Can anyone point me to where the logs can be found?
    Cannot reach server. The server may still be starting up.
  • Adaparis Malibu

    03/16/2022, 2:58 AM
    Hi @Team, good morning! I started using Airbyte for PoC work and it looks like the right choice for us. We are pulling Shopify-related data through Airbyte. I have 2 queries, specifically related to transformation and to integrating Airbyte into our app: 1. Is there a way to get rid of the __airbyte__tmp tables in the destination? We are using MySQL as the destination. 2. We would like to collect the API keys and credentials needed to add a new Shopify account from our custom app UI. Is this a possibility? If yes, can you please guide us? Thanks in advance.
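On question 2: sources can be created programmatically through the Airbyte configuration API (`POST /api/v1/sources/create`). A sketch of building the request body; the IDs are placeholders and the Shopify configuration fields shown are illustrative, so check the connector's spec on your own instance before relying on them.

```python
import json

def build_source_payload(workspace_id: str, definition_id: str,
                         name: str, config: dict) -> str:
    """Build the JSON body for POST /api/v1/sources/create.
    The field names follow the Airbyte configuration API; IDs are placeholders."""
    return json.dumps({
        "workspaceId": workspace_id,
        "sourceDefinitionId": definition_id,
        "name": name,
        "connectionConfiguration": config,
    })

body = build_source_payload(
    "placeholder-workspace-id",
    "placeholder-shopify-definition-id",
    "My Shopify Store",
    {"shop": "my-store", "credentials": {"api_password": "***"}},  # illustrative fields
)
print(json.loads(body)["name"])  # My Shopify Store
```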
  • Samsudin Samsudin

    03/16/2022, 3:28 AM
    Hi all, I made a connection using an EL (Extract and Load) process with Airbyte and store the result in dataset "A" in BigQuery. I then use dbt for the transformation process to modify the tables, and I want to store the dbt results in another BigQuery dataset ("B"). Is it possible to use a transformation without the schema config from dbt, so that the tables from the transformation process land in a different dataset name in BigQuery? Maybe I can use a customized --profile config?
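Writing dbt models to a different dataset is usually handled with dbt's custom schema config rather than the profile: on BigQuery a dbt "schema" maps to a dataset. A hedged sketch (the project name is a placeholder; note that by default dbt appends the custom schema to the target schema, e.g. `A_B`, unless you override the `generate_schema_name` macro):

```yaml
# dbt_project.yml (placeholder project name)
models:
  my_project:
    +schema: B
```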
  • maxim

    03/16/2022, 9:31 AM
    Is this your first time deploying Airbyte: No
    OS Version / Instance: Ubuntu 18.04, built on GCP
    Memory / Disk: 2 CPU, 4 GB RAM / 50 GB balanced persistent disk
    Deployment: Docker
    Airbyte Version: 0.35.39-alpha
    Destination name/version: Postgres
    Step: Setting new destination
    Description: Hey everyone! I've got an error during Postgres destination setup. Why can't I set a schema name that starts with numbers?
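The restriction comes from SQL identifier rules: an unquoted PostgreSQL identifier must start with a letter or underscore, so a schema name beginning with a digit is rejected unless quoted. A quick check of that rule, sketched in Python:

```python
import re

# Unquoted PostgreSQL identifiers: letter or underscore first, then
# letters, digits, underscores, or dollar signs.
VALID = re.compile(r"^[a-zA-Z_][a-zA-Z0-9_$]*$")

def is_valid_unquoted(name: str) -> bool:
    """True if `name` is a legal unquoted PostgreSQL identifier."""
    return VALID.match(name) is not None

print(is_valid_unquoted("analytics_2022"))  # True
print(is_valid_unquoted("2022_analytics"))  # False (starts with a digit)
```

A common workaround is to prefix such names with an underscore (e.g. `_2022_analytics`).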
  • gunu

    03/16/2022, 10:19 AM
    Is this your first time deploying Airbyte: No
    OS Version / Instance: Linux EC2 m5.4xlarge
    Deployment: Docker
    Airbyte Version: 0.35.50-alpha
    Source: Survey Monkey 0.1.7
    Destination: Snowflake 0.4.17
    Description: Incremental + dedupe results in duplicate rows for the responses stream.
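For context, incremental + dedupe is supposed to keep one row per primary key, preferring the most recently emitted record; seeing duplicates means that invariant is broken. The intended behavior can be sketched as follows (the `id`/`emitted_at`/`answer` fields are illustrative, not SurveyMonkey's actual schema):

```python
def dedupe_latest(records, key="id", cursor="emitted_at"):
    """Keep one record per primary key, preferring the highest cursor value."""
    latest = {}
    for rec in records:
        k = rec[key]
        if k not in latest or rec[cursor] > latest[k][cursor]:
            latest[k] = rec
    return list(latest.values())

rows = [
    {"id": 1, "emitted_at": 100, "answer": "old"},
    {"id": 1, "emitted_at": 200, "answer": "new"},
    {"id": 2, "emitted_at": 150, "answer": "only"},
]
print(dedupe_latest(rows))  # two rows; id 1 keeps the "new" answer
```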
  • rupesh padhye

    03/16/2022, 11:16 AM
    git clone https://github.com/airbytehq/airbyte.git
    
    Cloning into 'airbyte'...
    remote: Enumerating objects: 212751, done.
    remote: Counting objects: 100% (212746/212746), done.
    remote: Compressing objects: 100% (56667/56667), done.
    remote: Total 212751 (delta 114204), reused 210820 (delta 113130), pack-reused 5
    Receiving objects: 100% (212751/212751), 99.24 MiB | 1.14 MiB/s, done.
    Resolving deltas: 100% (114204/114204), done.
    Updating files: 100% (9318/9318), done.
    cd airbyte
    docker-compose up
    WARNING: The RUN_DATABASE_MIGRATION_ON_STARTUP variable is not set. Defaulting to a blank string.
    WARNING: The SECRET_PERSISTENCE variable is not set. Defaulting to a blank string.
    WARNING: The WORKER_ENVIRONMENT variable is not set. Defaulting to a blank string.
    Traceback (most recent call last):
      File "urllib3/connectionpool.py", line 670, in urlopen
      File "urllib3/connectionpool.py", line 392, in _make_request
      File "http/client.py", line 1255, in request
      File "http/client.py", line 1301, in _send_request
      File "http/client.py", line 1250, in endheaders
      File "http/client.py", line 1010, in _send_output
      File "http/client.py", line 950, in send
      File "docker/transport/unixconn.py", line 43, in connect
    FileNotFoundError: [Errno 2] No such file or directory
    
    During handling of the above exception, another exception occurred:
    
    Traceback (most recent call last):
      File "requests/adapters.py", line 439, in send
      File "urllib3/connectionpool.py", line 726, in urlopen
      File "urllib3/util/retry.py", line 410, in increment
      File "urllib3/packages/six.py", line 734, in reraise
      File "urllib3/connectionpool.py", line 670, in urlopen
      File "urllib3/connectionpool.py", line 392, in _make_request
      File "http/client.py", line 1255, in request
      File "http/client.py", line 1301, in _send_request
      File "http/client.py", line 1250, in endheaders
      File "http/client.py", line 1010, in _send_output
      File "http/client.py", line 950, in send
      File "docker/transport/unixconn.py", line 43, in connect
    urllib3.exceptions.ProtocolError: ('Connection aborted.', FileNotFoundError(2, 'No such file or directory'))
    
    During handling of the above exception, another exception occurred:
    
    Traceback (most recent call last):
      File "docker/api/client.py", line 214, in _retrieve_server_version
      File "docker/api/daemon.py", line 181, in version
      File "docker/utils/decorators.py", line 46, in inner
      File "docker/api/client.py", line 237, in _get
      File "requests/sessions.py", line 543, in get
      File "requests/sessions.py", line 530, in request
      File "requests/sessions.py", line 643, in send
      File "requests/adapters.py", line 498, in send
    requests.exceptions.ConnectionError: ('Connection aborted.', FileNotFoundError(2, 'No such file or directory'))
    
    During handling of the above exception, another exception occurred:
    
    Traceback (most recent call last):
      File "docker-compose", line 3, in <module>
      File "compose/cli/main.py", line 81, in main
      File "compose/cli/main.py", line 200, in perform_command
      File "compose/cli/command.py", line 60, in project_from_options
      File "compose/cli/command.py", line 152, in get_project
      File "compose/cli/docker_client.py", line 41, in get_client
      File "compose/cli/docker_client.py", line 170, in docker_client
      File "docker/api/client.py", line 197, in __init__
      File "docker/api/client.py", line 221, in _retrieve_server_version
    docker.errors.DockerException: Error while fetching server API version: ('Connection aborted.', FileNotFoundError(2, 'No such file or directory'))
    [36228] Failed to execute script docker-compose
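The root cause in the traceback above is the final `FileNotFoundError` on the Docker Unix socket: docker-compose could not reach `/var/run/docker.sock`, which usually means the Docker daemon is not running (or the current user cannot access it). A tiny pre-flight check, sketched in Python:

```python
import os

def docker_socket_present(path="/var/run/docker.sock"):
    """True if the Docker daemon's Unix socket exists at the given path."""
    return os.path.exists(path)

# If this prints False, start the Docker daemon first, then rerun docker-compose up.
print(docker_socket_present())
```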
  • rupesh padhye

    03/16/2022, 11:17 AM
    I followed the steps at https://docs.airbyte.com/deploying-airbyte/local-deployment and it's giving this error.
  • Gabriel Kalunga

    03/16/2022, 4:17 PM
    We've tried it on both a macOS M1 instance and a Linux instance, and both give the same error.
  • Bruna Cardoso Cota

    03/16/2022, 5:13 PM
    Hi everyone! I'm a data intern and I'm trying to connect the Facebook Marketing API to Airbyte. As written in the documentation, I did get the permissions for ads_read and ads_management, but Facebook doesn't let me request Ads Management Standard Access, so it still throttles the token. I wanted to know if this last permission is 100% necessary, or if anyone else has had a similar problem and what solution they found. Thanks!
  • Steve Reeling

    03/16/2022, 5:27 PM
    Hi all, I am sourcing from Postgres and am having a challenge with the date types being transferred to my destinations (MS SQL Server and Postgres). The dates transfer as strings. Here are the field definitions:
    • Postgres source: created_at timestamp with time zone NOT NULL DEFAULT now()
    • Postgres destination: created_at character varying COLLATE pg_catalog."default"
    • MS SQL Server destination: [created_at] [varchar](max) NULL
    Any guidance is appreciated. Thanks.
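Until the typing is sorted out on the destination side, the varchar timestamps can still be parsed back downstream, assuming they are ISO-8601 strings (the sample value below is illustrative, not from this thread). A sketch in Python:

```python
from datetime import datetime

# Illustrative ISO-8601 value; datetime.fromisoformat handles the offset.
raw = "2022-03-16T17:27:00+00:00"
ts = datetime.fromisoformat(raw)
print(ts.year, ts.tzinfo is not None)  # 2022 True
```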
  • Michael Gao

    03/16/2022, 6:30 PM
    Hi, Is there a guide on how to downgrade a connector? I upgraded Zendesk from 0.1.3 to 0.2.1 and my existing data is no longer syncing successfully.
  • Salomon Dion

    03/16/2022, 7:25 PM
    Is this your first time deploying Airbyte: No
    OS Version / Instance: Linux Mint - Intel Core i5, 8 GB RAM, 256 GB storage
    Deployment: Docker
    Airbyte Version: 0.35.50-alpha
    Description: Cannot deploy Airbyte locally despite having available memory.
  • Rohit Sharma

    03/16/2022, 9:02 PM
  • Kyle Cheung

    03/17/2022, 4:25 AM
    Is this your first time deploying Airbyte: No
    OS Version / Instance: EC2 t2.large
    Deployment: Docker
    Airbyte Version: 0.35.55-alpha
    Source: Postgres
    Destination: Snowflake
    Description: After upgrading to 0.35.55-alpha, I can't seem to auto-deselect all streams. I had to deselect the sync option one by one on over 200 tables.
  • Kyle Cheung

    03/17/2022, 6:05 AM
    Schema refresh and resync did not do the trick either
  • Manish Tomar

    03/17/2022, 6:24 AM
    I have installed Airbyte on my local machine and I am trying to connect MySQL with an S3 bucket. But when I set S3 as my destination, it says "Testing Failed".
  • Manish Tomar

    03/17/2022, 6:26 AM
    I have used the correct Access Key and Secret Key.
  • Manish Tomar

    03/17/2022, 6:26 AM
    What is the issue, then?
  • Archita Dash

    03/17/2022, 6:26 AM
    I am getting a similar issue.
  • Archita Dash

    03/17/2022, 6:27 AM
    The source and destination are not getting set up at all.