# ask-community-for-troubleshooting

    Rahul Borse

    10/27/2022, 7:21 AM
    Hello everyone, has anyone tried securing an Airbyte instance with Keycloak, OAuth, or anything similar? Can someone point me to resources on Airbyte security, other than what's available on the Airbyte website?

    Carlos Santini

    10/27/2022, 9:30 AM
    hi there. I'm setting up Airbyte v0.40.33-helm on AWS EKS and using an external Aurora Postgres 10.20 for its metadata. However, the bootloader pod is failing with the error below. Does anyone know what the issue might be, or whether we need to use a specific RDS PG version?
    2022-10-27 09:21:09 INFO i.a.c.EnvConfigs(getEnvOrDefault):1079 - Using default value for environment variable RUN_DATABASE_MIGRATION_ON_STARTUP: 'true'
    2022-10-27 09:21:09 INFO i.a.b.BootloaderApp(runFlywayMigration):342 - Migrating configs database
    2022-10-27 09:21:09 INFO o.f.c.i.l.s.Slf4jLog(info):49 - Flyway Community Edition 7.14.0 by Redgate
    2022-10-27 09:21:09 INFO o.f.c.i.l.s.Slf4jLog(info):49 - Successfully validated 27 migrations (execution time 00:00.013s)
    2022-10-27 09:21:09 INFO o.f.c.i.l.s.Slf4jLog(info):49 - Current version of schema "public": 0.40.11.002
    2022-10-27 09:21:09 INFO o.f.c.i.l.s.Slf4jLog(info):49 - Migrating schema "public" to version "0.40.12.001 - AddWebhookOperationColumns"
    2022-10-27 09:21:09 INFO i.a.d.i.c.m.V0_40_12_001__AddWebhookOperationColumns(migrate):22 - Running migration: V0_40_12_001__AddWebhookOperationColumns
    2022-10-27 09:21:09 ERROR o.f.c.i.l.s.Slf4jLog(error):57 - Migration of schema "public" to version "0.40.12.001 - AddWebhookOperationColumns" failed! Changes successfully rolled back.
    Caused by: org.jooq.exception.DataAccessException: SQL [alter type "operator_type" add value 'webhook']; ERROR: ALTER TYPE ... ADD cannot run inside a transaction block
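The root cause in the log above is a Postgres restriction: before version 12, `ALTER TYPE ... ADD VALUE` cannot run inside a transaction block, and Flyway applies this migration transactionally. A heavily hedged sketch of one manual avenue, assuming direct access to the metadata database, is to apply the enum change as a standalone autocommitted statement (the bootloader may additionally need the migration recorded as applied, so this is an illustration rather than a complete fix):

```sql
-- Sketch only: run as a single standalone statement (e.g. via psql),
-- outside any transaction block. IF NOT EXISTS makes it safe to re-run.
ALTER TYPE operator_type ADD VALUE IF NOT EXISTS 'webhook';
```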

    Truc Nguyen

    10/27/2022, 9:44 AM
    Hi everyone, the instructions for the Docker deployment show a PostgreSQL URL for the DATABASE_URL env variable. Can I use MySQL for that?

    Georg Heiler

    10/27/2022, 10:31 AM
    how can I include customEvent dimensions in the Airbyte export from Google Analytics 4? (They are part of the ecommerce items of GA.)

    Manish Tomar

    10/27/2022, 10:32 AM
    Hello everyone, we moved from a (DMS -> Redshift) stack to (Airbyte -> Snowflake), but the overall cost shot up by 4x. Our Airbyte is deployed on AWS EKS, and we are unable to understand what went wrong. What are the things I should check? What are the best practices? BTW I'm using a Small warehouse to load data into Snowflake (on average it takes 55 min to fetch a 15 GB table).
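As a quick sanity check on those numbers: 15 GB in 55 minutes is only about 4.7 MB/s, which may help frame whether the bottleneck is extraction, the network, or the warehouse rather than Snowflake pricing alone. A back-of-envelope sketch:

```python
# Back-of-envelope throughput for the sync described above:
# a 15 GB table fetched in 55 minutes.
table_gb = 15
minutes = 55

mb_per_s = table_gb * 1024 / (minutes * 60)
print(f"{mb_per_s:.2f} MB/s")  # about 4.65 MB/s
```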

    Julien Calderan

    10/27/2022, 12:29 PM
    Hello everyone, we are running Airbyte as a docker-compose application on a VM (GCE) in order to synchronise a large amount of data from Salesforce to BigQuery. The first approach was to use the Salesforce source and BigQuery destination; however, the synchronisation took 8 days (for a single table) before crashing due to a 400 Bad Request from the Salesforce API. Our questions: • Is there a way to ensure (or check) that Airbyte will resume the next sync where the last one crashed? • When using destinations other than BigQuery (e.g. GCS with parquet files), it seems that Airbyte workers accumulate really big chunks of data (currently > 10 GB, and still increasing) before loading them to the destination (GCS). Is there a way to change this to a more stream-like or mini-batch behavior?
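On the second question, the behavior being asked for is essentially size-bounded mini-batching: flush the buffer to the destination whenever it crosses a byte threshold instead of accumulating indefinitely. An illustrative sketch of the idea (not Airbyte's actual worker code; all names are hypothetical):

```python
def flush_in_batches(records, max_bytes):
    """Buffer records and flush whenever the buffer reaches max_bytes,
    instead of accumulating one giant chunk before loading."""
    buffer, size, batches = [], 0, []
    for rec in records:
        buffer.append(rec)
        size += len(rec)
        if size >= max_bytes:
            batches.append(buffer)
            buffer, size = [], 0
    if buffer:  # flush the remainder
        batches.append(buffer)
    return batches

# Ten 4-byte records with an 8-byte threshold flush as five small batches.
print(len(flush_in_batches(["abcd"] * 10, 8)))  # -> 5
```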

    Monique Jimenez

    10/27/2022, 1:27 PM
    Hi all 👋 We're running Airbyte open source on k8s, and I was wondering if there are any plans to allow a lookback window in place of a hardcoded start date when setting up a GitLab connector. It looks to be available for the Slack connector already, which is nice for keeping unnecessary data transfer down.
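For context on what a lookback window buys here: instead of a fixed start date, each sync derives its start from "now minus N days", so the window slides forward with every run. A minimal sketch (hypothetical names, not the GitLab connector's actual configuration):

```python
from datetime import datetime, timedelta, timezone
from typing import Optional

def effective_start_date(lookback_days: int, now: Optional[datetime] = None) -> str:
    """Derive a sliding start date from a lookback window; each sync only
    reaches back a fixed distance rather than to a hardcoded date."""
    now = now or datetime.now(timezone.utc)
    return (now - timedelta(days=lookback_days)).strftime("%Y-%m-%dT%H:%M:%SZ")

# A 30-day lookback computed from a fixed reference time:
ref = datetime(2022, 10, 27, 12, 0, tzinfo=timezone.utc)
print(effective_start_date(30, ref))  # -> 2022-09-27T12:00:00Z
```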

    Anish Giri

    10/27/2022, 2:48 PM
    Hey everyone, I'm new to Airbyte and just spun up a local instance. We work with multiple customers and would love to group all connectors associated with each customer. Not only would this make for a cleaner setup, it would also mean the destination could be set for a group of connectors depending on the customer they belong to. What's the best way to support this type of multi-tenancy?

    Marielby Soares

    10/27/2022, 3:11 PM
    Hello everyone, is there a way to set up a connection for one source and multiple destinations? Say I need to put the same data in MySQL and also in Kafka.

    Lucas Gonthier

    10/27/2022, 3:28 PM
    Hi all, I would like to know if there is a reason to prefer octavia-cli over the API? I don't see a use case where it is better to use octavia.

    Grant Pendrey

    10/27/2022, 3:45 PM
    I am prototyping a project using Airbyte to replicate data from MSSQL --> BigQuery, running Airbyte open source on a GCP VM with an external Google Cloud SQL Postgres database. On every sync, the first attempt fails but the second attempt succeeds. From what I can see in the logs, the first attempt produces the following error message:
    2022-10-27 14:40:35 INFO i.a.w.g.DefaultNormalizationWorker(run):95 - Normalization summary: io.airbyte.config.NormalizationSummary@6d415f1a[startTime=1666881618122,endTime=1666881635122,failures=[io.airbyte.config.FailureReason@59d118f6[failureOrigin=normalization,failureType=system_error,internalMessage=Model 'model.airbyte_utils.custom_reports_stream' (models/generated/airbyte_tables/airbyte/custom_reports_stream.sql) depends on a source named 'airbyte._airbyte_raw_custom_reports_stream' which was not found,externalMessage=Normalization failed during the dbt run. This may indicate a problem with the data itself.,metadata=io.airbyte.config.Metadata@3c78dc3b[additionalProperties={attemptNumber=0, jobId=30, from_trace_message=true}],stacktrace=AirbyteDbtError: 
    Encountered an error:
    Compilation Error in model custom_reports_stream (models/generated/airbyte_tables/airbyte/custom_reports_stream.sql)
      Model 'model.airbyte_utils.custom_reports_stream' (models/generated/airbyte_tables/airbyte/custom_reports_stream.sql) depends on a source named 'airbyte._airbyte_raw_custom_reports_stream' which was not found,retryable=<null>,timestamp=1666881635122], io.airbyte.config.FailureReason@744c05a7[failureOrigin=normalization,failureType=system_error,internalMessage=Model 'model.airbyte_utils.custom_reports_stream' (models/generated/airbyte_tables/airbyte/custom_reports_stream.sql) depends on a source named 'airbyte._airbyte_raw_custom_reports_stream' which was not found,externalMessage=Normalization failed during the dbt run. This may indicate a problem with the data itself.,metadata=io.airbyte.config.Metadata@13123dbb[additionalProperties={attemptNumber=0, jobId=30, from_trace_message=true}],stacktrace=AirbyteDbtError: 
    Encountered an error:
    Compilation Error in model custom_reports_stream (models/generated/airbyte_tables/airbyte/custom_reports_stream.sql)
      Model 'model.airbyte_utils.custom_reports_stream' (models/generated/airbyte_tables/airbyte/custom_reports_stream.sql) depends on a source named 'airbyte._airbyte_raw_custom_reports_stream' which was not found,retryable=<null>,timestamp=1666881635122]]]
    1. Can someone help me understand this error message? 2. Also, since it works every time on the second attempt… is the second attempt doing something different that allows it to succeed? Maybe it’s simply skipping the normalization process, or something like that?

    DR

    10/27/2022, 4:15 PM
    I am trying to connect the Facebook Marketing source, but it throws the following error. Can anyone help me resolve it?
    FacebookAPIException('Error:2635, (#2635) You are calling a deprecated version of Ads API. Please update to the latest version: V15.0.')

    Giovani Freitas

    10/27/2022, 4:45 PM
    Hi guys! Beginner's question: I configured the HubSpot source and put "2022-09-26T000000Z" as the start date, but after the sync was over I went to look at the files and found records whose created_at field was from 2021! As I understand it, only records created from the start date onward should be synced, right?
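One common explanation, worth verifying against the connector's documentation, is that the start date is compared against each record's last-modified timestamp rather than created_at, so an old record that was updated after the start date still syncs. A minimal sketch of that semantics, with hypothetical field names and data:

```python
from datetime import datetime

def ts(s: str) -> datetime:
    """Parse an ISO-8601 timestamp with a trailing Z."""
    return datetime.fromisoformat(s.replace("Z", "+00:00"))

START = ts("2022-09-26T00:00:00Z")

records = [
    {"id": 1, "created_at": "2021-03-01T00:00:00Z", "updated_at": "2022-10-01T00:00:00Z"},
    {"id": 2, "created_at": "2022-01-15T00:00:00Z", "updated_at": "2022-05-01T00:00:00Z"},
]

# Filtering on updated_at keeps record 1 despite its 2021 created_at.
synced = [r["id"] for r in records if ts(r["updated_at"]) >= START]
print(synced)  # -> [1]
```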

    Thomas Pedot

    10/27/2022, 5:35 PM
    Hello everyone! I tested a newly made connector, and it didn't work, not because of the connector itself but because of the destination's version. I will raise an issue, but I wonder if you plan (if not already) to publish some kind of compatibility matrix? For example, the connector was updated because it had stopped working; now it works, but it requires an update of the target.

    Francisco Viera

    10/27/2022, 6:47 PM
    how could I extract my dbt project on GKE?

    Venkat Dasari

    10/27/2022, 8:29 PM
    Folks, Airbyte on my EC2 is down. Where can I get the logs? How can I restart it? Will it lose all its state? Please help

    José Torero

    10/27/2022, 8:49 PM
    Hi team, please, I need help connecting to MySQL

    José Torero

    10/27/2022, 8:49 PM
    image.png

    Christopher Teljstedt

    10/27/2022, 9:30 PM
    Hi, I am trying to fetch schemas from an MSSQL 0.4.24 connection (Azure Synapse) and get the following message, taken from the logs:
    2022-10-27 21:25:11 INFO i.a.p.j.e.LoggingJobErrorReportingClient(reportJobFailureReason):23 - Report Job Error -> workspaceId: 28f8bfd5-ac55-4717-812f-1152ebf0eed2, dockerImage: airbyte/source-mssql:0.4.24, failureReason: io.airbyte.config.FailureReason@269e6540[failureOrigin=<null>,failureType=system_error,internalMessage=com.microsoft.sqlserver.jdbc.SQLServerException: Some part of your SQL statement is nested too deeply. Rewrite the query or break it up into smaller queries.,externalMessage=Something went wrong in the connector. See the logs for more details.,metadata=io.airbyte.config.Metadata@5c3bff1d[additionalProperties={attemptNumber=null, jobId=null, from_trace_message=true, connector_command=discover}],stacktrace=com.microsoft.sqlserver.jdbc.SQLServerException: Some part of your SQL statement is nested too deeply. Rewrite the query or break it up into smaller queries.

    Thomas C

    10/27/2022, 10:04 PM
    Does Airbyte (Cloud) support Snowflake (source) -> Snowflake (destination)? I've tried changing the source to different databases/schemas/accounts, but it does not appear to be able to list any streams to sync on the Replication page when setting up a connection. I'm trying it out as a quick way to deep-copy a customer's data. We currently use data shares and then tasks/workflows to deep copy, but are looking to generalize it for additional source warehouses (BigQuery/Redshift/etc).

    Chris Weis

    10/27/2022, 11:41 PM
    Feedback on this community (because I don't see any better channel for feedback): I see this #C03909JGQKV channel (and probably others) promoted on the Community page, but it looks like it was archived. Somebody should probably remove it from the Community page.

    Rishabh D

    10/28/2022, 6:40 AM
    Hey team, I was trying to refresh the source schema while pulling Salesforce data, but a data reset happens every time. Is there a way to avoid this, as the re-sync takes much longer than expected?

    Zaza Javakhishvili

    10/28/2022, 7:43 AM
    Hi, could someone please take another look at and review my Amazon Seller Partner new-reports pull request so it can be merged ASAP? https://github.com/airbytehq/airbyte/pull/18283

    Home Kralych

    10/28/2022, 8:14 AM
    Hi everyone. Is there an option on Airbyte for a conditional sync? Say, I want to sync only certain rows that match an expression

    Duck Psy

    10/28/2022, 9:30 AM
    Hi team, can I use MySQL (instead of PostgreSQL) as the database for Airbyte?

    laila ribke

    10/28/2022, 10:51 AM
    Hi all, I need to run server:/tmp/workspace/${NORMALIZE_WORKSPACE}/build/run/airbyte_utils/models/generated/ models/ from dbt Cloud, to transform the data in Airbyte through dbt. How should I do it?

    Home Kralych

    10/28/2022, 10:58 AM
    I am syncing a table using the “Incremental | Deduped History approach” using Basic Normalization. The final deduped table is partitioned by _airbyte_emitted_at. Is there a way to partition it by a timestamp available in the source table instead of this field?

    Rahul Borse

    10/28/2022, 12:42 PM
    Hi team, HubSpot API keys will be removed next month, per the document below. When can we expect the corresponding changes in Airbyte OSS for connecting to the HubSpot source? https://developers.hubspot.com/changelog/upcoming-api-key-sunset

    laila ribke

    10/28/2022, 12:51 PM
    I'm lost. I need to set up a dbt transformation for one of my connections. Is it only possible through Docker?