# ask-community-for-troubleshooting
  •

    Payal Bansal

    03/08/2023, 7:15 AM
    We are trying to use Airbyte with Jira as the source and Postgres as the destination, but sprint information is not getting saved in the DB, whereas board info is. No idea why.
  •

    yuan sun

    03/08/2023, 7:48 AM
    I want to write a new source and add it to Airbyte. The new source is Dameng, a database from China. What should I do?
  •

    yuan sun

    03/08/2023, 7:49 AM
    DamengSource.java
  •

    Lenin Mishra

    03/08/2023, 8:23 AM
    Hey folks, I am trying to build an Airbyte connector for Zoho Books. I am stuck on how to use the refresh token to regenerate the access token using the low-code CDK approach, since the access token has an expiration date. Can anyone suggest how to deal with this?
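On the refresh-token question above: the low-code CDK ships OAuth authenticator components that handle this, but the underlying pattern is easy to sketch independent of the framework. Below is a minimal, framework-free Python sketch of "cache the access token, refresh it shortly before expiry"; `fake_refresh` stands in for a real POST to Zoho's token endpoint (URL and parameters intentionally omitted here).

```python
import time

class RefreshingAuth:
    """Caches an access token and refreshes it via `refresh_fn`
    shortly before it expires."""

    def __init__(self, refresh_fn, early_secs=60):
        self.refresh_fn = refresh_fn  # returns (access_token, expires_in_secs)
        self.early_secs = early_secs  # refresh this many seconds early
        self._token = None
        self._expires_at = 0.0

    def token(self):
        if self._token is None or time.time() >= self._expires_at - self.early_secs:
            self._token, expires_in = self.refresh_fn()
            self._expires_at = time.time() + expires_in
        return self._token

# Stubbed token endpoint so the sketch is self-contained; a real refresh_fn
# would POST the stored refresh token and return the new access token.
calls = []
def fake_refresh():
    calls.append(1)
    return f"token-{len(calls)}", 3600

auth = RefreshingAuth(fake_refresh)
print(auth.token())  # first call triggers a refresh
print(auth.token())  # within the expiry window: served from cache
```

In a real connector, prefer the CDK's built-in OAuth handling; the sketch only shows the mechanism it implements.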
  •

    Gerard Clos

    03/08/2023, 10:35 AM
    I've opened a couple of PRs regarding the Zoho connector, maybe you can take a look 🙏 https://github.com/airbytehq/airbyte/pull/23818 https://github.com/airbytehq/airbyte/pull/23863
  •

    Gurpreet Singh

    03/08/2023, 11:37 AM
    Hi folks, I'm trying to deploy Airbyte to AWS EC2 and am facing the error
    service "bootloader" didn't complete successfully: exit 255
    after the
    docker-compose up -d
    command. Am I missing something here?
  •

    Nick Saroki

    03/08/2023, 1:15 PM
    Anyone out there using the Amazon Seller Partner API integration for
    GET_FLAT_FILE_ALL_ORDERS_DATA_BY_LAST_UPDATE_GENERAL
    ? I'm not certain what the best sync mode would be here... An amazon-order-id will have changes in order-status and item-status, and I don't want the historical statuses. If I do Incremental | Deduped + history w/ a PK of amazon-order-id, I can't have orders w/ multiple rows. I'm not sure what else I'd need to add to the PK to make it update safely. Does Amazon ever put the same SKU in more than one row?
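On the composite-key question above: one common option (an assumption about the data, not a documented Amazon guarantee) is to use `amazon-order-id` plus `sku` as the dedup key and keep the most recently updated row per key; if the same SKU can legitimately repeat within one order, the key would need a further column. A minimal sketch of that dedup rule:

```python
def dedupe_latest(rows, key_fields=("amazon-order-id", "sku"),
                  updated_field="last-updated-date"):
    """Keep only the most recently updated row per composite key."""
    latest = {}
    for row in rows:
        key = tuple(row[f] for f in key_fields)
        if key not in latest or row[updated_field] > latest[key][updated_field]:
            latest[key] = row
    return list(latest.values())

rows = [
    {"amazon-order-id": "A1", "sku": "S1", "order-status": "Pending", "last-updated-date": "2023-03-01"},
    {"amazon-order-id": "A1", "sku": "S1", "order-status": "Shipped", "last-updated-date": "2023-03-05"},
    {"amazon-order-id": "A1", "sku": "S2", "order-status": "Pending", "last-updated-date": "2023-03-01"},
]
print(len(dedupe_latest(rows)))  # 2: (A1, S1) keeps "Shipped", (A1, S2) keeps "Pending"
```

This mirrors what Incremental | Deduped + history does with the chosen primary key and cursor field.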
  •

    Magnus Fagertun

    03/08/2023, 1:18 PM
    We're creating a connector with the CDK, and I'm wondering how soon do we need to worry about pagination? Do we need to use it for 50 objects, or 100, each with 50 fields?
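On the pagination question above: it depends on the API's page-size limit rather than on your object count; 50-100 objects typically fit in a single page if the API allows it. In the Python CDK this is usually expressed via `next_page_token`, but the underlying loop is plain offset paging, sketched here against a stubbed API (function and variable names hypothetical):

```python
def read_all(fetch_page, page_size=100):
    """Page through an API by offset until a short (or empty) page signals the end."""
    offset = 0
    while True:
        page = fetch_page(offset=offset, limit=page_size)
        yield from page
        if len(page) < page_size:
            break
        offset += page_size

# Stub API holding 250 objects, just to show the loop shape.
DATA = list(range(250))
def fake_fetch(offset, limit):
    return DATA[offset:offset + limit]

items = list(read_all(fake_fetch, page_size=100))
print(len(items))  # 250
```

With a page size at or above the object count, the loop degenerates to a single request, so supporting pagination early costs little.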
  •

    Rachel RIZK

    03/08/2023, 1:57 PM
    Hello Airbyte team! I've opened a PR to fix missing columns in Bing Ads reports: • changes are not complex • impact: users will be able to do trend analysis with this (including on deleted campaigns) Is anyone available to review and help me launch integration tests? (I think it's legacy and needs
    /test connector=connectors/source-bing-ads
    ) Also this connector is GA 👀 Thanks 🙏
  •

    Tmac Han

    03/08/2023, 2:17 PM
    Hi team, could you help merge this PR https://github.com/airbytehq/airbyte/pull/22855 ? It's just a pip upgrade.
  •

    Andre Santos

    03/08/2023, 2:33 PM
    Hello everyone. I'm looking into collecting metrics for Datadog; I was reading the documentation section https://docs.airbyte.com/operator-guides/collecting-metrics/ and tried to see the list of available metrics, but the link seems to be broken... Can you give me some help with that?
  •

    Ronen Konfortes

    03/08/2023, 2:52 PM
    Hey guys 👋 There was a commit of mine with a change to the Jenkins source merged into https://github.com/faros-ai/airbyte-connectors 2 months ago. I'm running my own Airbyte instance on K8s, and I seem to lack some knowledge on how this connection (Airbyte -> external connectors) is made. When browsing to
    Settings/Sources
    there seems to be a Jenkins source with latest version (
    0.1.23
    ) released 2 years ago. I deployed Airbyte using the latest Helm chart (
    airbyte-0.44.5
    ). What am I missing? How do I get external connector changes into my Airbyte instance?
  •

    Sivaprathap S

    03/08/2023, 2:59 PM
    I connected Jira as a source and Postgres as a destination via Airbyte. After making the connection, most of the tables in the Postgres DB are empty (sprints, etc.). Can I know the reason, please?
  •

    Gurpreet Singh

    03/08/2023, 3:07 PM
    @Romain Billot Thanks for the help. It seems the error is with the latest release version. I manually installed the latest Compose file as you suggested; later, upon digging further into the logs, I found the error related to
    Failed to fetch local catalog definitions
  •

    Gurpreet Singh

    03/08/2023, 3:09 PM
    Can somebody help me with the previous release version URL to use in the wget command? I feel that will help me move forward while this is looked at by the Airbyte team on GitHub 🙏
  •

    Jorge Barrionuevo

    03/08/2023, 3:19 PM
    Hello everyone, I'd like to know if it's possible to integrate HashiCorp Vault with Airbyte to manage secrets, thank you!
  •

    Ismail Tigrek

    03/08/2023, 3:46 PM
    I'm getting this error when trying to start Airbyte
  •

    Mickaël Andrieu

    03/08/2023, 4:11 PM
    Hi, I have a weird issue with the Google Sheets connector: for some reason it doesn't import all the rows it reads. Any idea? Should I play with the batch size value? This sheet is built using the QUERY function from another sheet of this file.
    2023-03-08 16:08:05 source > Row counts: {[...], 'workshops': 12038}
    2023-03-08 16:08:13 INFO i.a.w.g.DefaultReplicationWorker(lambda$readFromSrcAndWriteToDstRunnable$6):347 - Records read: 1000 (541 KB)
    2023-03-08 16:08:13 INFO i.a.w.g.DefaultReplicationWorker(lambda$readFromSrcAndWriteToDstRunnable$6):347 - Records read: 2000 (1 MB)
    2023-03-08 16:08:13 source > Fetching range workshops!2003:4003
    2023-03-08 16:08:13 INFO i.a.w.g.DefaultReplicationWorker(lambda$readFromSrcAndWriteToDstRunnable$6):359 - Total records read: 2363 (1 MB)  => WHY NOT 12038 lines, or at least 4003 ????
  •

    Neethu Thomas

    03/08/2023, 4:24 PM
    Hi, I couldn't see these
    _ab_cdc_log_pos
    ,
    _ab_cdc_log_file
    ,
    _ab_cdc_deleted_at
    ,
    _ab_cdc_updated_at
    columns in our new Airbyte tables. Can anyone help me with this?
  •

    James Salmon

    03/08/2023, 4:37 PM
    I am building a Dagster integration with Airbyte which will kick off a downstream process. I'd like to manage the sync timing from Airbyte, though, and use a Dagster sensor to know when the sync has completed. Is there an API endpoint that can tell me this, and if so, where can I find its documentation? I know Dagster can schedule the sync in Airbyte and monitor it until completion, running downstream assets afterwards; but in my case I want to schedule the sync in Airbyte, monitor connections, and then run downstream assets when a sync completes.
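On the polling question above: the Airbyte Config API exposes `POST /api/v1/jobs/list` for a connection's job history. The response shape used below is an assumption (verify against your instance's API docs); a Dagster sensor only needs to pull the newest sync's status out of that response:

```python
def latest_sync_status(jobs_response):
    """Return (job_id, status) for the newest job in a jobs/list-style
    response, or None if there are no jobs. Assumes newest-first ordering."""
    jobs = jobs_response.get("jobs", [])
    if not jobs:
        return None
    job = jobs[0]["job"]
    return job["id"], job["status"]

# Example payload in the assumed shape:
sample = {"jobs": [
    {"job": {"id": 904, "configId": "conn-uuid", "status": "succeeded"}},
    {"job": {"id": 903, "configId": "conn-uuid", "status": "failed"}},
]}
print(latest_sync_status(sample))  # (904, 'succeeded')
```

A sensor would remember the last job id it acted on and yield a run request only when a new id arrives with status `succeeded`.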
  •

    Yusuf Ogunjobi

    03/08/2023, 4:38 PM
    Hello 👋 all, I have a question regarding Airbyte upgrades when deployed with Docker on an EC2 instance and a cloud-managed Postgres instance. This statement here is not really clear to me. Is there any risk of losing config data when using a cloud-managed Postgres instance?
  •

    José Lúcio Zancan Júnior

    03/08/2023, 5:58 PM
    I'm having a problem with normalization on the BigQuery destination. It doesn't seem to be specific to a source (I had this problem before with other sources, like Facebook and TikTok), but now I'm having trouble with a specific connection between Google Ads and BigQuery. At every sync attempt, I'm getting an error. General message:
    Failure Origin: normalization, Message: Normalization failed during the dbt run. This may indicate a problem with the data itself.
    In the logs, I'm seeing a repeating error:
    Pickling client objects is explicitly not supported.
    Clients have non-trivial state that is local and unpickleable.
    I'm attaching the log file to this message; the errors seem to start on line 1201 (when the sync starts to interact with BigQuery). What I already tried, with no success: • Delete and recreate the connection • Delete and recreate the source • Delete and recreate the destination • Delete and recreate the dataset on BigQuery • Change the dataset location (in the BQ and Airbyte config) between US and us-central1 • Give the BigQuery Administrator IAM role to the service account used by the destination • Change the loading method of the destination between GCS Staging and Standard Inserts • Change between batch and interactive in Transformation Query Run Type. What worked, but is kind of unfeasible for us (I'm keeping it only as a last resort): setting the transformation to Raw data (instead of Normalized tabular data). Key points: • The same source works fine with a Google Cloud Storage destination • The same type of destination (BigQuery) works fine with other sources, with the same configurations, even the sync mode (changing only the source) • The sync mode of this connection is
    Incremental | Dedup + history
    and
    Full refresh | Overwrite
    on the
    account_performance_report
    stream • We are using v0.41.0 on GKE with Helm, but this problem has been occurring since v0.40.30 (the first one we installed). Thanks in advance.
  •

    José Lúcio Zancan Júnior

    03/08/2023, 6:00 PM
    I forgot the file 😁
    f697802f_7e37_4395_87f0_8b2c20251504_logs_904_txt.txt
  •

    Jeff Jolicoeur

    03/08/2023, 7:18 PM
    Where can we find the list of metrics Airbyte OSS sends to Datadog now? Documentation points to a file that no longer exists https://github.com/airbytehq/airbyte/blob/master/airbyte-metrics/metrics-lib/src/main/java/io/airbyte/metrics/lib/OssMetricsRegistry.java
  •

    Matheus Barbosa

    03/08/2023, 10:00 PM
    Hi guys, I have a question regarding Powered by Airbyte endpoints. I already created a source (successfully) and also created a connection; everything is okay, except the connection I just created through the API is disabled by default. So when I try to trigger a sync it doesn't work as expected, and I receive a 500 status error. How can I create the connection with enabled status, or update it via the API, without having to do it in the UI? I have lots of users; it will be hell to enable all the connections manually haha
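On the disabled-by-default connection above: the Config API has a `connections/update` endpoint that carries a `status` field (`active`/`inactive`). Whether it accepts a partial payload, and the exact field names, are assumptions here, so verify against the API reference. A sketch with the HTTP transport injected so it runs without a live server:

```python
API_ROOT = "http://localhost:8000/api/v1"  # default Airbyte OSS API root (assumption)

def enable_connection(connection_id, post):
    """Flip one connection to 'active'. `post(url, body)` is injected so any
    HTTP client (requests, urllib, ...) can be plugged in."""
    body = {"connectionId": connection_id, "status": "active"}
    return post(f"{API_ROOT}/connections/update", body)

# Stub transport showing the call shape; in production pass e.g.
#   lambda url, body: requests.post(url, json=body).json()
sent = []
def fake_post(url, body):
    sent.append((url, body))
    return {"status": "active"}

for cid in ["conn-1", "conn-2"]:  # loop over all user connection ids
    enable_connection(cid, fake_post)
print(len(sent))  # 2
```

Looping this over every connection id avoids enabling each one by hand in the UI.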
  •

    Michael Biriley

    03/08/2023, 10:14 PM
    Hi all, I'm deploying Airbyte OSS on EC2, and it's all running smoothly 😄 Now I want to switch to using AWS Secrets Manager to store credentials for sources and destinations. Is adding
    SECRET_PERSISTENCE="AWS_SECRET_MANAGER"
    to the .env file I'm using to configure the Docker stack going to work? Will it migrate existing secrets to AWS? I've already confirmed the EC2 instance role has IAM permissions for Secrets Manager. Let me know if there are any docs I can refer to. 😄 🙏
  •

    Rachid Belaid

    03/08/2023, 10:22 PM
    I recently upgraded Airbyte from 0.40.0 to 0.40.39 and am definitely seeing more memory issues 😕
  •

    yuan sun

    03/09/2023, 3:35 AM
    Does anyone know how to solve this problem?
  •

    VISHAL B

    03/09/2023, 6:46 AM
    Hello team, I was loading records from MySQL to BigQuery and all the records were loaded successfully, but there is an error in the sync log. Why is this error occurring?!
  •

    VISHAL B

    03/09/2023, 6:50 AM
    Any update on this error?