# ask-community-for-troubleshooting
  • m

    Maxime Sabran

    02/16/2023, 12:14 PM
Hello! Is it possible to import Airbyte configuration in the new version?
  • m

    Matthieu Rousseau

    02/16/2023, 1:04 PM
Hello guys! We are facing this issue when we try to install the dependencies for low-code connector development (Step 2,
python main.py spec
):
    Traceback (most recent call last):
      File "/Users/matrousseau/Programming/Python/beamy/airbyte/airbyte-integrations/connectors/source-netskope/.venv/lib/python3.10/site-packages/airbyte_cdk/sources/declarative/manifest_declarative_source.py", line 141, in _validate_source
        validate(self._source_config, declarative_component_schema)
      File "/Users/matrousseau/Programming/Python/beamy/airbyte/airbyte-integrations/connectors/source-netskope/.venv/lib/python3.10/site-packages/jsonschema/validators.py", line 934, in validate
        raise error
    jsonschema.exceptions.ValidationError: 'DpathExtractor' is not one of ['CustomRecordExtractor']
    
    Failed validating 'enum' in schema[0]['properties']['type']:
        {'enum': ['CustomRecordExtractor'], 'type': 'string'}
    
    On instance['type']:
        'DpathExtractor'
    Do you have any information on this?
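The `'DpathExtractor' is not one of [...]` error above comes from jsonschema validating the connector manifest against the CDK's declarative component schema, and usually means the installed airbyte-cdk is too old to know that component type. A minimal stand-alone reproduction of the failing enum check (the schema below is a simplified stand-in, not the real CDK schema):

```python
import jsonschema

# Simplified stand-in for the extractor section of the CDK's
# declarative component schema (for illustration only).
schema = {"properties": {"type": {"enum": ["CustomRecordExtractor"], "type": "string"}}}
instance = {"type": "DpathExtractor"}

try:
    jsonschema.validate(instance, schema)
    error = None
except jsonschema.exceptions.ValidationError as e:
    error = e.message

print(error)  # 'DpathExtractor' is not one of ['CustomRecordExtractor']
```

If an outdated CDK is indeed the cause, recreating the venv with a newer `airbyte-cdk` (`pip install -U airbyte-cdk`) typically refreshes the schema.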
  • g

    Gunnar Lykins

    02/16/2023, 2:24 PM
Hello - just wanted to see when I should expect this PR to be merged? It looks like it's approved but hasn't been merged to master. If this can be done ASAP, that would be great, as it is blocking some provisioning via Helm on my application. Thanks!
  • d

    Domenic

    02/16/2023, 3:19 PM
Hey folks, I am using an OracleDB (source) connecting to AzureBlobStorage (destination). I have successfully executed a sync, but when I check the destination (AzureBlobStorage), the data is stored as a file with no extension. I want Azure to recognize that it is a CSV file.
  • k

    Kyle Cheung

    02/16/2023, 3:24 PM
    Has anyone had success using the BambooHR connector to pull a custom field? Mine always comes through as
    NULL
  • d

    Dhruv Satish

    02/16/2023, 3:55 PM
Hi Team, just a small question regarding dbt. I used dbt with Airbyte running locally and everything works as expected. But when I try to use the same dbt setup with my prod Airbyte configuration (running on Plural and Kubernetes), I get an error - dbt > entrypoint.sh: line 6: cd: git_repo: No such file or directory. Wanted to check whether custom dbt works with Plural? Attaching a screenshot.
  • g

    god830

    02/16/2023, 4:09 PM
    Not sure if you are aware of the Zoom JWT deprecation. https://marketplace.zoom.us/docs/guides/build/jwt-app/
  • a

    Anchit

    02/16/2023, 7:14 PM
Hello! I'm trying to set up Iceberg as a destination locally. I see the Iceberg destination only supports JDBC, Hive and Hadoop catalogs currently, not the REST catalog (Airbyte docs). From the Tabular blog, it seems like they are pushing for the REST catalog over the previous JDBC approaches (reasons in the blog). Wanted to check if anyone has worked with Airbyte <> Iceberg (with REST catalog).
  • l

    Lucas Wiley

    02/16/2023, 7:24 PM
Can custom dbt transformations be used to hash data from a specific connector so that PII is never exposed in the destination? Any examples of this out there? E.g. if I want to hash PII on a stream, I'm thinking I could create a separate GitHub dbt repo that hashes target columns before they are loaded into the destination table. Though I imagine there may still be PII exposed in the tmp tables that Airbyte uses for staging the data.
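One pattern for the hashing idea above, sketched in Python rather than dbt SQL (in a dbt model the equivalent would be applying the warehouse's sha2()/md5() function to the column); the function name and salt here are made up for illustration:

```python
import hashlib

def hash_pii(value: str, salt: str = "my-static-salt") -> str:
    # Deterministic salted SHA-256: the same input always maps to the
    # same digest, so joins across tables still line up, but the raw
    # PII never appears in the destination table.
    return hashlib.sha256((salt + value).encode("utf-8")).hexdigest()

row = {"id": 1, "email": "jane@example.com"}
row["email"] = hash_pii(row["email"])
```

Note the poster's caveat stands either way: Airbyte's staging/tmp tables are written before custom dbt transformations run, so the raw values can still transit through them.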
  • h

    Harshil Dhariwal

    02/16/2023, 9:04 PM
Hey folks! I have been using Airbyte through the API interface and intermittently through the UI. It is amazing what you people have accomplished. Recently I came across an issue that has been a blocker. I am trying to use ClickHouse as the source and S3 as the destination. Somehow, the first sync doesn't emit any records; sometimes the 2nd or 3rd is able to retrieve the records. I have been using a similar setup with Snowflake, Azure and BigQuery and everything runs smoothly, so I suspect the culprit is the connector. Can someone help me with this?
  • d

    Danny Steffy

    02/16/2023, 9:38 PM
Hello! Trying to set up a sync from an MSSQL db to a local Postgres db, using incremental dedupe + history on an integer column. I was able to get it working for 2 tables in a single connection, but it seems to be running pretty slowly, and I'm not sure what to look at to speed it up. The initial sync completed in 28m35s for 407k records. I would like to sync some additional tables in the millions-of-rows range. The log seems to be spamming
Generated legacy state for x streams
. Am I missing some sort of configuration to speed up the sync? I'm not sure what that log line means either. Thanks for the help!
  • w

    wp

    02/16/2023, 9:56 PM
Anyone using Google Universal Analytics have experience with the
isDataGolden
flag not turning true at all? I understand the connector adds a 2-day lookback window to compensate for data-processing latency, but on my daily syncs
isDataGolden
has not returned true for past days, and the sync moves on to new days, skipping over the days whose data is still
isDataGolden = false
  • s

    Sam Kmetz

    02/16/2023, 9:58 PM
I haven’t seen anything in the docs or in the code itself, but is there any option to specify transaction isolation on MS SQL? I have a case where I need to read uncommitted or else risk locking the entire database while pulling data. If there is no locking issue due to some other mechanism (not a Java expert here, just been weeding through the code), please let me know, because as it stands the MSSQL DBA is saying Airbyte is a non-starter unless we can have it specify READ UNCOMMITTED.
  • j

    Jesse Myers

    02/16/2023, 11:52 PM
Hi everyone, I'm new to Airbyte and trying to sync some data from MSSQL to Postgres. The process works when my databases are publicly accessible; however, my use case requires me to connect to databases attached to a private Docker overlay network. I've attached the
worker
service to the network in question, but the connector fails to establish a connection. I can see that the worker spins up a source and destination container; how can I configure those to be attached to the network in question?
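For the overlay-network question above, a sketch of a docker-compose override (the network name my_overlay is hypothetical; substitute your own). Note the caveat in the comments about the worker-launched containers:

```yaml
# docker-compose.override.yml -- attach the long-running Airbyte services
# to an existing overlay network (name is an assumption).
networks:
  my_overlay:
    external: true

services:
  worker:
    networks:
      - my_overlay
  server:
    networks:
      - my_overlay

# Caveat: the worker starts the source/destination containers itself via
# `docker run`, so they do not automatically inherit these networks; check
# the worker logs for the network name it passes to those containers.
```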
  • a

    Arik Elbag

    02/17/2023, 12:24 AM
    Getting this error when trying to connect SFDC “Failure Origin: source, Message: Implementation restriction: EntitySubscription only allows security evaluation for non-admin users when LIMIT is specified and at most 1000” What can I do to fix this?
  • m

    Mohamed Anas

    02/17/2023, 3:00 AM
Hi Team, I am trying to build my own connector for the SIKKA API. generate.sh doesn't work as expected. I haven't modified it but still get this error. Any help would be appreciated.
    #!/usr/bin/env bash
    
    error_handler() {
      echo "While trying to generate a connector, an error occurred on line $1 of generate.sh and the process aborted early.  This is probably a bug."
    }
    trap 'error_handler $LINENO' ERR
    
    set -e
    
# Ensure script always runs from this directory because that's how docker build contexts work
    cd "$(dirname "${0}")" || exit 1
    
    # Make sure docker is running before trying
    if ! docker ps; then
      echo "docker is not running, this script requires docker to be up"
      echo "please start up the docker daemon!"
      exit
    fi
    
    _UID=$(id -u)
    _GID=$(id -g)
# Remove container if it already exists
    echo "Removing previous generator if it exists..."
    docker container rm -f airbyte-connector-bootstrap >/dev/null 2>&1
    
    # Build image for container from Dockerfile
    # Specify the host system user UID and GID to chown the generated files to host system user.
# This is done because all generated files in the container's mounted folders have a root owner
    echo "Building generator docker image..."
    docker build --build-arg UID="$_UID" --build-arg GID="$_GID" . -t airbyte/connector-bootstrap
    
    # Run the container and mount the airbyte folder
    if [ $# -eq 2 ]; then
      echo "2 arguments supplied: 1=$1 2=$2"
      docker run --name airbyte-connector-bootstrap --user "$_UID:$_GID" -e HOME=/tmp -e package_desc="$1" -e package_name="$2" -v "$(pwd)/../../../.":/airbyte airbyte/connector-bootstrap
    else
      echo "Running generator..."
      docker run --rm -it --name airbyte-connector-bootstrap --user "$_UID:$_GID" -e HOME=/tmp -v "$(pwd)/../../../.":/airbyte airbyte/connector-bootstrap
    fi
    
    echo "Finished running generator"
  • a

    Arvind Pai

    02/17/2023, 3:23 AM
    Hi, has anyone successfully run airbyte behind a proxy?
  • v

    vismaya Kalaiselvan

    02/17/2023, 6:01 AM
Hi everyone! Is there a way to refresh the schema of a table in Airbyte using API calls?
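The OSS Configuration API has an endpoint for re-running schema discovery, which is what the UI's "refresh source schema" uses under the hood. A hedged sketch (endpoint path, field names, and host are my assumptions from the v1 config API; verify against your instance's API docs):

```python
import json

AIRBYTE_HOST = "http://localhost:8000"  # assumed local deployment

def build_refresh_schema_request(source_id: str) -> tuple[str, str]:
    # POST this body to re-run schema discovery for a source, bypassing
    # the cached catalog.
    url = f"{AIRBYTE_HOST}/api/v1/sources/discover_schema"
    body = json.dumps({"sourceId": source_id, "disable_cache": True})
    return url, body

url, body = build_refresh_schema_request("0000-1111")
# send with e.g. requests.post(url, data=body,
#                              headers={"Content-Type": "application/json"})
```

The response should contain the freshly discovered catalog, which you can then feed into a connection-update call.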
  • m

    Madhu Prabhakara

    02/17/2023, 6:03 AM
    Hey Everyone
  • m

    Madhu Prabhakara

    02/17/2023, 6:04 AM
We were running an instance of Airbyte on an EC2 instance and suddenly I see that all syncs fail with the message "Waiting for logs"... I tried restarting the containers but it didn't help. Anyone have a clue what might be happening?
  • m

    Muhammad Imtiaz

    02/17/2023, 6:11 AM
    Hello Team,
I need help parsing airbyte-worker logs using Fluentd. The logs are not in JSON format, so it takes a lot of effort to parse them in Fluentd.
I've searched a bit into how airbyte-worker produces logs, and it turns out it primarily uses the Logback framework (a successor to Log4j). I'm trying to find the logback.xml file, which is supposed to be present in the
airbyte-config
directory, but I couldn't find it there. I want to see how I can format these logs as JSON by changing that xml file. Since I've deployed Airbyte in K8s, I'm really interested in whether the logging format is parameterized so I can change it from the
.env
file. I have seen some logging-related environment variables in .env:
    ### LOGGING/MONITORING/TRACKING ###
    TRACKING_STRATEGY=segment
    JOB_ERROR_REPORTING_STRATEGY=logging
    # Although not present as an env var, expected by Log4J configuration.
    LOG_LEVEL=INFO
    Thanks in advance!
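For reference, a generic sketch of what a JSON-emitting logback.xml looks like. This only applies if the image really uses Logback; the .env comment above mentions Log4j, whose equivalent would be a JSON layout in log4j2.xml. The encoder class is from the real logstash-logback-encoder library, which must be on the classpath:

```xml
<!-- Hypothetical logback.xml: emit one JSON object per log line -->
<configuration>
  <appender name="STDOUT" class="ch.qos.logback.core.ConsoleAppender">
    <!-- LogstashEncoder serializes each event as JSON, which Fluentd
         can then parse without custom regexes -->
    <encoder class="net.logstash.logback.encoder.LogstashEncoder"/>
  </appender>
  <root level="INFO">
    <appender-ref ref="STDOUT"/>
  </root>
</configuration>
```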
  • o

    Ohad

    02/17/2023, 6:32 AM
Hey Team, I need your help, please! I'm trying to connect to MSSQL using an SSH tunnel, following these instructions. From my VM (located on Azure) I can SSH to the jump server (located on my client's network) and tunnel SQL queries using SQLCMD over port 1433 with no issues. However, I can't get this to work with the Airbyte MSSQL connector (version 0.4.28). I get this cryptic message
non-json response
. When I check the logs I see the error
Server at /xxx.xxx.xxx.xxx:22 presented unverified EC key: SHA256:cl8s2xxxx
I can't find this key anywhere on my jump server, and I'm not sure of its source. I'd appreciate any ideas/help with this 🙏 PS. Airbyte version is 0.40.25
  • s

    sam pai

    02/17/2023, 6:44 AM
My issue: https://github.com/airbytehq/airbyte/issues/22546 Can someone show me a screenshot if your Bing Ads source works on the open-source Docker version? I checked the source code /usr/local/airbyte/airbyte-oauth/src/main/java/io/airbyte/oauth/flows/MicrosoftBingAdsOAuthFlow.java It needs to fetch an access token... but as far as I know it also needs a login/password on https://login.microsoftonline.com/common/oauth2/v2.0/authorize?client_id=XXX My Azure application redirect_uri=https://login.microsoftonline.com/common/oauth2/nativeclient How does it work? How do I configure Bing Ads on Airbyte?
  • l

    Lucio Melito

    02/17/2023, 6:53 AM
Hey everyone! I have a Postgres-BigQuery connection with some geospatial data in EWKB (extended WKB) format, which BigQuery does not recognize. Is there any way to apply some transformations to those columns in the Postgres db before they get imported into BQ? Thanks!
  • g

    George R

    02/17/2023, 9:15 AM
Hello everyone! I'm syncing reports from TikTok to Postgres, but there is something weird going on. None of the data actually matches TikTok's dashboard or manual API requests. Is there something specific that I need to look for? Maybe a timezone offset in the Airbyte requests?
  • m

    Mark

    02/17/2023, 12:42 PM
Hi, has anyone deployed Airbyte within ECS Fargate? Where I currently work we are not allowed to use EKS.
  • u

    user

    02/17/2023, 12:45 PM
    Are you able to create another source/destination? What version of Airbyte are you using?
  • o

    Oliver Meyer

    02/17/2023, 1:25 PM
    Hi 🙂 I just opened a PR to fix a bug in the S3 source. I'm not sure how to run the integration tests as they seem to connect to stuff to which I don't have access. Can someone help me out? https://github.com/airbytehq/airbyte/pull/23195 this bug is causing us a headache in production by crashing pods because of the amount of logs the source generates, so I'd love to get this released ASAP. Thanks!
  • w

    Walker Philips

    02/17/2023, 4:07 PM
Easy question: does the "Reset your data" button reset the state of the sync? In my experience it doesn't seem to. Either way, I think that is an important bit of knowledge to have on the documentation page; it's not currently there.
  • c

    Christina

    02/17/2023, 5:04 PM
    Hi Airbyte, we are currently using Fivetran but considering an alternative. Does Airbyte support Databricks Serverless SQL endpoint as a destination at the moment?