# ask-community-for-troubleshooting
  • Abdul Hameed

    12/21/2022, 8:57 AM
    Hi All, I am new to Airbyte and looking for some help. I have set up open-source Airbyte on my local machine by cloning the GitHub repository and running the instance with docker-compose up. I have added a source and a destination and created a connection. In the Transformation tab I selected the radio button for Normalized tabular data, in the Replication tab I set the Destination Namespace to Mirror Source Structure and left the prefix blank, and finally I clicked the Sync button to run. I noticed that the data tables from the source were created in the destination, but some additional tables were also created that store the data in raw JSON format. Since I have selected only 2 tables so far this is not an issue, but if I select more tables going forward this could lead to space issues. The additional tables start with the prefix "_airbyte_raw_" and I have not specified this prefix anywhere in the process. For more info I have attached a screenshot below that shows the actual tables and the JSON tables.
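    For illustration, here is roughly what one record in such a raw table holds, sketched as YAML for a hypothetical `users` stream (the column names follow Airbyte's raw-table format of this era; normalization then unpacks `_airbyte_data` into the typed table):

    # one row of _airbyte_raw_users (hypothetical values)
    _airbyte_ab_id: "9a1f..."                    # record id assigned by Airbyte
    _airbyte_emitted_at: "2022-12-21T08:57:00Z"  # when the source emitted the record
    _airbyte_data:                               # raw payload that normalization flattens
      id: 42
      name: "Jane Doe"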
  • Vrushank Kenkre

    12/21/2022, 10:18 AM
    Hello, we are using the S3 destination connector. In the settings we have specified the format as JSON Lines with no compression, but the files are still getting stored in S3 with gzip compression. Can you please advise on how to fix this?
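    For reference, this is the shape of the format settings we have, sketched from memory (field names worth double-checking against the connector version):

    format:
      format_type: "JSONL"
      compression:
        compression_type: "No Compression"   # vs "GZIP"; with this set, files should land uncompressed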
  • Alexandre Chouraki

    12/21/2022, 10:39 AM
    Hello, we're having issues any time we try to refresh a source schema:
    Error: Internal Server Error: No value present.
    This seems to be impacting all our connections. Airbyte version is 0.40.26. This is pretty critical; we could definitely use some help...
  • Andrzej Lewandowski

    12/21/2022, 1:08 PM
    Hey Everyone, I’ve just started using Airbyte. I’m moving data from MySQL to Snowflake and I saw that JSON columns are converted to STRING, not VARIANT. Is it possible to change this behaviour?
  • Chris Hoffman

    12/21/2022, 1:19 PM
    to which tags should I pay attention if I'm rocking the docker-compose version?
  • Artur Wagner

    12/21/2022, 1:27 PM
    Hey guys, I am trying to update my Airbyte from 0.35.12 to 0.43.6, using the Helm chart with an external Postgres database, and during the database migration I get the following error:
    Migration of schema "public" to version "0.40.12.001 - AddWebhookOperationColumns" failed!
    Any clues on what to do to fix it?
  • Moaz ElNashar

    12/21/2022, 3:29 PM
    Hey guys, I've managed to deploy Airbyte on GKE using Helm charts (0.43.6) with default configurations. At first I was able to add sources and destinations without any problem, but suddenly all of them fail with a `non-json response` error!
  • Sebastian Brickel

    12/21/2022, 4:45 PM
    Hi team, I have locally modified an existing connector and uploaded its image to Docker Hub in order to use it on my production Airbyte setup, which runs on GCP. When I add this new connector locally on my, rather old, 0.39.42-alpha, I have no problem. However, on my production Airbyte, which runs 0.40.23, I receive
    Internal Server Error: Get Spec job failed.
    I went through the advice given on these sites https://discuss.airbyte.io/t/internal-server-error-get-spec-job-failed/2508 https://discuss.airbyte.io/t/unable-to-add-connectors-get-spec-job-failed/2184 https://github.com/airbytehq/airbyte/issues/12030 but nothing fixed it yet. To be clear, `python main.py spec` and `python main.py check --config secrets/config.json` work as intended. So does `docker run --rm -i sbrickel/instagram:dev spec`. Does anyone have any advice for me, or an idea why I do not get the same error when adding the new connector locally? I did `docker pull sbrickel/instagram:dev` in my VM as well, as suggested by someone in the GitHub issue. Thank you
  • Sam Stoelinga

    12/21/2022, 5:49 PM
    Shout out to @Ivica Taseski for the very thorough round of reviews on a new destination I'm working on. He really took the time to understand every line of code in a PR of ~1400 lines of code and found several issues that would have impacted end-users. Sorry if this isn't the right forum for this and sorry if you didn't want this kind of recognition @Ivica Taseski
    octavia loves 5
    🙌🏽 1
    🙌 4
  • Nicholas Cowan

    12/21/2022, 5:55 PM
    Hi, has anyone here ever tried to use an Azure SQL Managed Instance as a source? My managed instance has a public endpoint configured, which allows me to connect to it from SSMS, but Airbyte gives me a `non-json response` error when trying to set it up as a source. Edit: the connection works when I select "Standard" as the replication method, but fails when I choose CDC. I have CDC enabled on the database and all the tables in it, and SQL Server Agent is running.
  • anni

    12/21/2022, 8:02 PM
    Hi team, two questions when replicating data from Google Ads: 1. For `ad_group_ad_report`: not all our active campaigns showed up in this table, and likely there are 3 campaigns in the `ad_group` table that are missing from the `campaign` table. 2. For `campaign`: the `segments.hour` values are not counting from 0 to 23.
  • Tien Nguyen

    12/21/2022, 10:24 PM
    I am configuring a scheduler for a connection via the Airbyte API. Can anyone please help me with the difference between `schedule` and `scheduleData`? If we set `schedule`, do we need to set `scheduleData` and `scheduleType`? Thanks very much in advance.
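    For reference, here is how I currently understand the newer fields fit together, sketched as YAML for readability (the API takes the JSON equivalent; field names worth verifying against the API docs):

    scheduleType: basic            # one of: manual | basic | cron
    scheduleData:
      basicSchedule:               # used when scheduleType is "basic"
        timeUnit: hours
        units: 24
    # for cron instead:
    # scheduleData:
    #   cron:
    #     cronExpression: "0 0 * * * ?"
    #     cronTimeZone: "UTC"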
  • Soshi Nakachi仲地早司

    12/22/2022, 5:42 AM
    Hello teams. Q1. I am building Airbyte on GKE (kustomize). I have it set up to write logs to GCS, but it is stuck as per the issue below: https://discuss.airbyte.io/t/worker-process-of-airbyte-v0-40-9-fails-to-start-on-custom-s3-config/2849 WORKER_LOGS_STORAGE_TYPE and WORKER_STATE_STORAGE_TYPE were tried with both GCS and MINIO. Any progress on these? Q2. I would also like to write logs to GCS without creating a pod for `airbyte-minio`; is this possible? I don’t know the proper `.env` settings in that case.
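    For Q2, this is the kind of setting I have in mind, sketched as a ConfigMap fragment since the setup is kustomize-based (variable names are from the docs of this era; the bucket name and credentials path are hypothetical):

    apiVersion: v1
    kind: ConfigMap
    metadata:
      name: airbyte-env                        # hypothetical name
    data:
      WORKER_LOGS_STORAGE_TYPE: "GCS"
      WORKER_STATE_STORAGE_TYPE: "GCS"
      GCS_LOG_BUCKET: "my-airbyte-logs"        # hypothetical bucket
      GOOGLE_APPLICATION_CREDENTIALS: "/secrets/gcs-log-creds.json"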
  • Anandkumar Dharmaraj

    12/22/2022, 6:06 AM
    Hi team, I connected Freshdesk as a source with Airbyte. I didn't set a start date in the connection configuration; I thought it would take the entire data from the source if I didn't set one. But only the last 30 days of data were replicated through the Airbyte data pipeline from Freshdesk to PostgreSQL. How do I get the entire data from Freshdesk? Can you please help me with this?
  • Monika Bednarz

    12/22/2022, 7:14 AM
    Hi Team, happy Christmas season to all of you 🎅 🙂 We have a few daily connections set up in Airbyte and every once in a while they fail at the same time. They come from all different connectors and are always accompanied by an error in the platform. Could you please help with this? 🙏 Our ELT process gets blocked each time. Rerunning the syncs works very rarely; it’s like the pods can’t get started at all. Logs below ⬇️
  • Muhammad Imtiaz

    12/22/2022, 7:55 AM
    👋 Hello team, quick question: I wanted to ask if we can lock the source/destination connector versions during Airbyte installation, meaning whenever I try to configure a source, it should always pull the exact version (e.g. in case of Asana, 0.1.4). A quick response would be highly appreciated. cc: @Davin Chia (Airbyte) @Marcos Marx (Airbyte)
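    One option I'm considering is pinning the definition through the config API, sketched below as YAML for readability (endpoint and field names as I understand the config API; the definition id is hypothetical and this is not verified on our version):

    # POST /api/v1/source_definitions/update
    sourceDefinitionId: "d8313939-..."   # hypothetical definition id
    dockerImageTag: "0.1.4"              # e.g. pin Asana to 0.1.4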
  • Manish Tomar

    12/22/2022, 8:34 AM
    Hello Team, does anyone know how to access the Airbyte directory on an EKS cluster? I want to check/access the file airbyte/charts/airbyte/values.yaml.
  • Rohith Reddy

    12/22/2022, 9:02 AM
    Hello, I am having an issue syncing BigQuery data to Redshift.
  • Valentyn Solonechnyi

    12/22/2022, 1:07 PM
    Hi team and community members, hope someone can point out the possible server configuration problem which prevents my Salesforce connector from fetching the schema:
    airbyte-server                      | 2022-12-22 12:54:54 INFO i.a.s.RequestLogger(filter):112 - REQ 192.168.0.5 POST 200 /api/v1/sources/discover_schema - {"sourceId":"eb207a5d-c37c-484d-a59e-293fde025113","disable_cache":true}
    airbyte-server                      | Dec 22, 2022 12:54:54 PM org.glassfish.jersey.server.ServerRuntime$Responder writeResponse
    airbyte-server                      | SEVERE: An I/O error has occurred while writing a response message entity to the container output stream.
    airbyte-server                      | org.glassfish.jersey.server.internal.process.MappableException: org.eclipse.jetty.io.EofException
    The tech details are here as well https://discuss.airbyte.io/t/failed-to-fetch-schema-for-salesforce-on-gcp/3519 It's working fine on my local machine, so it's definitely a GCP config problem with a proxy. Is there some parameter I could adjust to keep the connection alive and let the worker finish the process and return the schema?
  • Till Blesik

    12/22/2022, 3:13 PM
    Hi everyone, I am working on a connector using the low-code configuration framework. The API I'm building the connector for has changing structures depending on the endpoint. I know that the record selector can be used to specify the field, but how can I account for different fields depending on the stream? For example, for the /model API the response is `{ meta: {}, status: {}, models: {} }`, for /workspaces it is `{ meta: {}, status: {}, workspaces: {} }`, and for another type of endpoint with tabular data it is `{ pages: {}, columnCoordinates: {}, rows: {} }`. How can I account for different response structures per stream? Can I specify a record selector per stream? Thank you for your help!
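    Here is a sketch of what I imagine this could look like in the low-code YAML, with one record selector per stream (field names such as `field_pointer` follow the low-code CDK docs of this era; worth verifying against the CDK version):

    streams:
      - name: models
        retriever:
          record_selector:
            extractor:
              field_pointer: ["models"]        # pick this endpoint's envelope field
      - name: workspaces
        retriever:
          record_selector:
            extractor:
              field_pointer: ["workspaces"]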
  • Rami M Theeb

    12/22/2022, 3:30 PM
    Hey everyone, I am facing issues connecting an external Postgres DB from Azure to my Airbyte (deployed on Docker). I'd appreciate any help: https://discuss.airbyte.io/t/crash-loop-when-connecting-to-an-external-azure-postgres-db/3522
  • Mario Beteta

    12/22/2022, 5:17 PM
    Hi, is there any way to check the schema of a connector locally, just like Airbyte does in the first step when you try to import it and connect it to a destination? I can't debug it, and apparently I don't get any error.
  • Jon M

    12/22/2022, 5:50 PM
    I'm trying to store to an MSSQL destination and I'm getting an error I don't understand: `Failure Origin: replication, Message: Something went wrong during replication`. Can anyone point me in the right direction please?
  • Anurag Jain

    12/22/2022, 6:00 PM
    1. Last time of the sync is not to be confused with the time up to which data is loaded. Account for replication lag at the source.
    2. The sync schedule storage - where will it be maintained?
    3. Assuming the structure comparison will be done before every load - with an option to configure an approval for new field additions.
    4. Sync modes / 2 / deduped history - the destination doesn't support primary keys as far as I understand, unless you have staged the data and are using it to join. I don't understand this; I will need you to explain what we are doing here.
    5. CDC can't capture deleted records - a very big caveat. Correct me if I'm wrong.
  • Marcos Marx (Airbyte)

    12/22/2022, 6:45 PM
    Hello everyone, I created a guide showing how to transfer an Airbyte instance to another server! https://discuss.airbyte.io/t/how-to-import-export-airbyte-to-a-new-instance-docker-to-docker-deploy/3514
    👍 3
  • Tyler Hogan

    12/22/2022, 7:43 PM
    Hey everyone, I am currently trying to set up S3 logging, using the Airbyte Helm chart to deploy (v0.43.3). My deployment starts up fine, but on testing source connections I get a “specified bucket does not exist” error. I’ve double-checked the spelling of my bucket and even given it full public access just to rule out permissions issues while testing. Am I missing something in the setup? My values file:
    global:
      state:
        storage:
          type: "S3"
      logs:
        accessKey:
          password: "secret"
          existingSecret: ""
          existingSecretKey: ""
        secretKey:
          password: "secret"
          existingSecret: ""
          existingSecretKey: ""
        storage:
          type: "S3"
        s3:
          enabled: true
          bucket: my-bucket-name-here
          bucketRegion: us-east-2
  • Jon M

    12/22/2022, 9:13 PM
    Is there a way to store just the actual data when storing to an MSSQL destination? I don't want all of the Airbyte JSON output in my tables.
  • Jon M

    12/22/2022, 9:43 PM
    Any clue why this would show up on the UI for Connections, Sources, and Destinations? I can go to settings, but I can only get this for the others. Logs aren't saying anything obvious, and I've restarted the whole compose a couple times.
  • Robert Put

    12/22/2022, 9:56 PM
    Hello, I just upgraded to the latest version of Airbyte, but now when refreshing a schema I get this error:
    Failed to fetch schema. Please try again
    
    Error: Internal Server Error: No value present
    • All my connections, other than the connection I configured from the UI yesterday, have the issue after upgrading
      ◦ Postgres and Stripe to Snowflake
      ◦ latest version
    • 1 Postgres connection does not have the issue, and the only diff I can think of is that it was configured over the UI vs the CLI
    The previous version was 0.40.17. Any recent changes that would impact this?
  • Nicholas Cowan

    12/22/2022, 10:10 PM
    Hi everyone, I'm posting this here so that this issue is more easily searchable: Azure SQL Managed Instance with CDC is incompatible with the MSSQL `v0.4.26` source connector. More details can be found in the GitHub issue.