# ask-community-for-troubleshooting

  • Luke Pammant (12/14/2022, 11:29 PM)
    Hey everyone. Has anyone managed to set up the Google Analytics connector with OAuth? It doesn't make sense to me that it requires a refresh token. Shouldn't it just need a client ID/secret, with the access token and refresh token coming once a user authenticates?
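
    One way to mint the refresh token the source config asks for is to run Google's installed-app OAuth consent flow once, outside Airbyte. A minimal sketch, assuming a client_secret.json OAuth client downloaded from the Google Cloud console and `pip install google-auth-oauthlib`; this is only an illustration of where the token comes from, not the connector's own flow:
    ```
    # Run the consent flow once and print the long-lived refresh token,
    # which can then be pasted into the Google Analytics source config
    # alongside the client ID and secret.
    from google_auth_oauthlib.flow import InstalledAppFlow

    SCOPES = ["https://www.googleapis.com/auth/analytics.readonly"]

    flow = InstalledAppFlow.from_client_secrets_file("client_secret.json", scopes=SCOPES)
    credentials = flow.run_local_server(port=0)  # opens a browser for user consent

    print("refresh_token:", credentials.refresh_token)
    ```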

  • Leo Schick (12/15/2022, 9:36 AM)
    Hello, I am new to Airbyte. I was using Singer taps before and built a custom tap for the AWIN Affiliate Network (https://github.com/Horze-International/tap-awin-advertiser). How can I easily get this tap running with Airbyte?

  • Leo Schick (12/15/2022, 9:36 AM)
    I saw that Airbyte used to support Singer taps natively. Is this still possible, or are there other ways to do this nowadays?
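
    Airbyte no longer ships a Singer-specific connector base, so one workaround is to wrap the tap in a small adapter that translates Singer messages into Airbyte protocol messages. A rough sketch of just the record translation, assuming the tap is installed on PATH as `tap-awin-advertiser`; spec/check/discover and SCHEMA/STATE handling are omitted:
    ```
    # Run a Singer tap and re-emit its RECORD messages as Airbyte RECORD messages.
    import json
    import subprocess
    import time

    proc = subprocess.Popen(
        ["tap-awin-advertiser", "--config", "tap_config.json"],
        stdout=subprocess.PIPE, text=True,
    )

    for line in proc.stdout:
        msg = json.loads(line)
        if msg.get("type") == "RECORD":
            print(json.dumps({
                "type": "RECORD",
                "record": {
                    "stream": msg["stream"],
                    "data": msg["record"],
                    "emitted_at": int(time.time() * 1000),
                },
            }))
    ```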

  • Shashank Tiwari (12/15/2022, 10:42 AM)
    Can anyone help me with this issue?
    i.m.r.Micronaut(handleStartupException):338 - Error starting Micronaut server: Bean definition [javax.sql.DataSource] could not be loaded: Error instantiating bean of type  [javax.sql.DataSource]
    This happens while deploying Airbyte through Kustomize, in the worker pod.

  • Josh Jeffries (12/15/2022, 10:42 AM)
    Hi everyone, is it possible to do the OAuth 2 authorization code flow via the low-code YAML builder? I can see that it sets up the token and refresh, but I'm not sure what/where I should be putting the auth endpoint. I was thinking it might be something like token_authorization_endpoint: "url".

  • Antonio Preučil (12/15/2022, 10:43 AM)
    Hi everyone, I'm starting to work with Airbyte but I have some questions about data sources. Is there any way to define a custom REST API data source? For example, I have a couple of custom REST APIs from which I need to fetch data, but among Airbyte's sources I only see predefined services.
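
    Besides the predefined connectors, custom REST APIs can be added through the connector builder / low-code YAML, or with the Python CDK. A minimal CDK sketch, where the base URL, endpoint path, field names and auth are placeholders rather than a real API:
    ```
    # Skeleton of a custom REST API source built on the Airbyte Python CDK.
    from typing import Any, Iterable, List, Mapping, Optional, Tuple

    import requests
    from airbyte_cdk.sources import AbstractSource
    from airbyte_cdk.sources.streams import Stream
    from airbyte_cdk.sources.streams.http import HttpStream


    class Orders(HttpStream):
        url_base = "https://api.example.com/v1/"  # placeholder
        primary_key = "id"

        def path(self, **kwargs) -> str:
            return "orders"  # placeholder endpoint

        def next_page_token(self, response: requests.Response) -> Optional[Mapping[str, Any]]:
            return None  # no pagination in this sketch

        def parse_response(self, response: requests.Response, **kwargs) -> Iterable[Mapping]:
            yield from response.json()


    class SourceMyApi(AbstractSource):
        def check_connection(self, logger, config) -> Tuple[bool, Any]:
            return True, None  # a real check would call the API

        def streams(self, config: Mapping[str, Any]) -> List[Stream]:
            return [Orders(authenticator=None)]
    ```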

  • Alper Uzun (12/15/2022, 12:30 PM)
    Hello everyone, I have a project related to the development of an existing system, and Airbyte is one part of it. I opened an issue on the discuss.airbyte.io forum. Is there anyone who can help me? https://discuss.airbyte.io/t/airbyte-server-connection-with-airbyte-temporal-seems-to-fail-with-docker-compose-up-centos-7/3467

  • Shashank Tiwari (12/15/2022, 12:43 PM)
    Hello everyone, I was deploying Airbyte using kustomization.yaml. When deployed with default values there was no issue on the EKS cluster, but as soon as we switch logging to S3 and use an external DB (exactly as described in the docs), the worker and cron pods fail with the error
    i.m.r.Micronaut(handleStartupException):338 - Error starting Micronaut server: Bean definition [javax.sql.DataSource] could not be loaded: Error instantiating bean of type  [javax.sql.DataSource]
    Is there something else we need to do in order to deploy with a custom S3 bucket and DB?

  • Sheshan (12/15/2022, 1:22 PM)
    Hello all, has anyone seen an error like this in the logs after the job is "SUCCEEDED"?
    ```
    2022-12-15 13:17:28 WARN i.t.i.w.ActivityWorker$TaskHandlerImpl(logExceptionDuringResultReporting):365 - Failure during reporting of activity result to the server. ActivityId = 572543e4-4b6a-3c21-87d3-f9986f3656c5, ActivityType = JobSuccessWithAttemptNumber, WorkflowId=connection_manager_457b6ad6-4826-4fb1-93c8-d93b8a4687d0, WorkflowType=ConnectionManagerWorkflow, RunId=823444ac-dc24-4e64-8dd5-f0b2fd32ee35
    io.grpc.StatusRuntimeException: NOT_FOUND: invalid activityID or activity already timed out or invoking workflow is completed

    2022-12-15 13:17:37 WARN i.t.i.a.ActivityTaskExecutors$BaseActivityTaskExecutor(execute):114 - Activity failure. ActivityId=572543e4-4b6a-3c21-87d3-f9986f3656c5, activityType=JobSuccessWithAttemptNumber, attempt=2
    java.lang.IllegalStateException: Transitioning Job 14 from JobStatus SUCCEEDED to SUCCEEDED is not allowed. The only valid statuses that an be transitioned to from SUCCEEDED are []
    ```
    This is preventing me from triggering the sync again. The trigger API gives the error below:
    ```
    {'message': 'A sync is already running for: 836a4c9a-7c07-40dc-b253-1937026c0db9', 'exceptionClassName': 'io.airbyte.server.errors.ValueConflictKnownException', 'exceptionStack': ['io.airbyte.server.errors.ValueConflictKnownException: A sync is already running for: 836a4c9a-7c07-40dc-b253-1937026c0db9',}
    ```
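
    When a job gets wedged like this, cancelling the stuck job through the API sometimes clears the "A sync is already running" conflict. A hedged sketch, assuming the open-source API is reachable on localhost:8000 and using the job id from the log above; treat the endpoint and payload as assumptions to verify against your API version:
    ```
    # Attempt to cancel the job the scheduler still believes is running.
    import requests

    API = "http://localhost:8000/api/v1"
    job_id = 14  # the job id reported in the error above

    resp = requests.post(f"{API}/jobs/cancel", json={"id": job_id})
    resp.raise_for_status()
    print(resp.json())
    ```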

  • Sebastian Brickel (12/15/2022, 2:07 PM)
    Hey team, my Mailchimp connector keeps failing after trying to sync for 24h: "Something went wrong during replication." Can someone have a look and help me, please? I have attached the log file.
    22b8b65b_bd02_45d5_ba52_32865c1fba9e_logs_2132_txt

  • Krzysztof Wabia (12/15/2022, 2:46 PM)
    Hello everyone, my company is using open-source Airbyte installed on Kubernetes. Is it possible to connect to another company's MySQL database that restricts access to VPN only? How should this connection be set up?

  • Ivan Ćirić (12/15/2022, 3:08 PM)
    Hi all! I'm trying to deploy Airbyte on Kubernetes through the AWS EKS service following these instructions: https://docs.airbyte.com/deploying-airbyte/on-kubernetes/ It was pretty straightforward to get the containers running, but I'm stuck on getting the IP/domain/endpoint to connect to Airbyte's UI. I have a really simple setup with 1 node group and 2 EC2 nodes. Did anyone encounter something similar?

  • Faris (12/15/2022, 4:14 PM)
    I am trying to understand why my table has a different number of rows in the destination. I am doing this experiment with only one table, which has 7.5m rows in the source. The replication mode is Incremental | Deduped history, and in my destination I have 3 tables (normalization enabled):
    • raw_table: 2.5m rows
    • table: 2.1m rows
    • table_scd: 2.6m rows
    I expect the scd and raw tables to have the same number of rows, and the main table to have the deduplicated number of rows. What am I missing here?

  • Gabriel Barbutti (12/15/2022, 4:32 PM)
    Hi everyone, I have a small question about normalization costs in BigQuery when backfilling. I want to get historical data from the last 3 years of Google Analytics, but the cost of basic normalization would be too high. Is there a way to reduce such costs? Thank you!

  • João Larrosa (12/15/2022, 6:08 PM)
    Hello! I'm migrating from Fivetran to Airbyte to ingest data from Shopify into BigQuery. After implementing it, I noticed that Airbyte is not bringing all the data that Fivetran brings. The inventory_items, product_variant and order_line tables seem to come through incomplete. For example, Fivetran's inventory_items table has 183 rows, while Airbyte's has 11. Could somebody help me out with this, please? Thank you very much.

  • anni (12/15/2022, 6:40 PM)
    Hi all, I need help with the following three questions/issues.
    1. [solved] When replicating campaigns data from Google Ads to Postgres, the sync failed during normalization; it looks like a data type error:
    ```
    2022-11-25 20:46:06 normalization > Found 9 models, 0 tests, 0 snapshots, 0 analyses, 579 macros, 0 operations, 0 seed files, 2 sources, 0 exposures, 0 metrics
    2022-11-25 20:46:06 normalization > Concurrency: 8 threads (target='prod')
    2022-11-25 20:46:06 normalization > 1 of 4 START table model google_ads_raw.campaign_labels................................................................. [RUN]
    2022-11-25 20:46:06 normalization > 2 of 4 START incremental model _airbyte_google_ads_raw.campaigns_stg.................................................... [RUN]
    2022-11-25 20:46:06 normalization > 1 of 4 OK created table model google_ads_raw.campaign_labels............................................................ [SELECT 0 in 0.58s]
    2022-11-25 20:46:06 normalization > 20:46:03 + "beamdatadw"._airbyte_google_ads_raw."campaigns_stg"._airbyte_ab_id does not exist yet. The table will be created or rebuilt with dbt.full_refresh
    2022-11-25 20:46:06 normalization > 2 of 4 ERROR creating incremental model _airbyte_google_ads_raw.campaigns_stg........................................... [ERROR in 1.09s]
    2022-11-25 20:46:06 normalization > 3 of 4 SKIP relation google_ads_raw.campaigns_scd....................................................................... [SKIP]
    2022-11-25 20:46:06 normalization > 4 of 4 SKIP relation google_ads_raw.campaigns........................................................................... [SKIP]
    2022-11-25 20:46:06 normalization > Finished running 1 table model, 3 incremental models in 1.76s.
    2022-11-25 20:46:06 normalization > Completed with 1 error and 0 warnings:
    2022-11-25 20:46:06 normalization > Database Error in model campaigns_stg (models/generated/airbyte_incremental/google_ads_raw/campaigns_stg.sql)
    2022-11-25 20:46:06 normalization >   invalid input syntax for type bigint: "0.0"
    2022-11-25 20:46:06 normalization >   compiled SQL at ../build/run/airbyte_utils/models/generated/airbyte_incremental/google_ads_raw/campaigns_stg.sql
    2022-11-25 20:46:06 normalization > Done. PASS=1 WARN=0 ERROR=1 SKIP=2 TOTAL=4
    2022-12-15 20:46:06 INFO i.a.w.g.DefaultNormalizationWorker(run):82 - Normalization executed in 14 seconds.
    2022-12-15 20:46:06 ERROR i.a.w.g.DefaultNormalizationWorker(run):90 - Normalization Failed.
    ```
    2. [unsolved] When replicating data from HubSpot to Postgres, the forms and form_submissions tables are not synced. The most recent data are from 2021-11-10 051426.000000.
    3. [unsolved] When replicating data from HubSpot to Postgres, I didn't find the call outcome attribute in the engagement-related tables.
    Thanks.

  • Krisjan Oldekamp (12/15/2022, 6:55 PM)
    Has anyone deployed Airbyte via Helm on GKE using Pulumi? I managed to set up the cluster and deploy the Helm chart, but I can't seem to access the UI (having trouble configuring ingress, etc.). Looking for some examples. Thanks!
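
    A minimal Pulumi (Python) sketch of the Helm deployment, assuming the pulumi_kubernetes provider, the airbytehq Helm repository, and an already-configured kubeconfig; the webapp.service.type values key is an assumption to double-check against the chart's values.yaml. Before wiring up ingress, port-forwarding the webapp service is usually the quickest way to confirm the UI is up.
    ```
    # Deploy the Airbyte Helm chart onto an existing GKE cluster with Pulumi.
    import pulumi
    from pulumi_kubernetes.helm.v3 import Release, ReleaseArgs, RepositoryOptsArgs

    airbyte = Release(
        "airbyte",
        ReleaseArgs(
            chart="airbyte",
            repository_opts=RepositoryOptsArgs(repo="https://airbytehq.github.io/helm-charts"),
            namespace="airbyte",
            create_namespace=True,
            # Expose the UI without ingress while testing (values key is an assumption).
            values={"webapp": {"service": {"type": "LoadBalancer"}}},
        ),
    )

    pulumi.export("release_status", airbyte.status)
    ```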

  • Luke Pammant (12/15/2022, 7:11 PM)
    When using dbt transformations with Airbyte, do I need a dedicated git repo for dbt, or can I use a dbt project inside my monorepo?

  • Patryk Kalinowski (12/15/2022, 9:31 PM)
    Hey, I'm trying to set up Airbyte on a Mac M1 running Ventura 13.0. Neither the general setup nor the M1-specific one works. After setting up the env variables and running VERSION=dev docker-compose up (these instructions), I'm getting:
    ```
    Error response from daemon: manifest for airbyte/connector-builder-server:dev not found: manifest unknown: manifest unknown
    ```
    A regular docker-compose up ends up with some Postgres errors, I believe. Attaching logs.
    ```
    2022-12-15 21:20:26 ERROR i.m.r.Micronaut(handleStartupException):338 - Error starting Micronaut server: Bean definition [javax.sql.DataSource] could not be loaded: Error instantiating bean of type  [javax.sql.DataSource]
    airbyte-cron                      | 
    airbyte-cron                      | Message: Driver org.postgresql.Driver claims to not accept jdbcUrl, <postgres://ckzlpjpqnwcbuv:622bd9e62eb1c835157e7fde72ab4f0c0a88e8d2b6659b4b0add166d81c51c17@ec2-54-158-247-210.compute-1.amazonaws.com:5432/d6vo5qg73t0t8l>
    ```
    airbyte-log.txt
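
    The last error suggests the database URL is being passed in postgres://user:password@host:port/db form, while the JDBC driver expects a jdbc:postgresql:// URL with the credentials supplied separately (in the docker-compose .env, typically DATABASE_URL plus DATABASE_USER and DATABASE_PASSWORD). A small sketch of the conversion, with a placeholder URL rather than the real credentials; the exact env variable names are worth confirming against your .env:
    ```
    # Split a postgres:// connection string into the pieces a JDBC config expects.
    from urllib.parse import urlparse

    url = "postgres://user:secret@example.compute-1.amazonaws.com:5432/mydb"  # placeholder
    parsed = urlparse(url)

    print(f"DATABASE_URL=jdbc:postgresql://{parsed.hostname}:{parsed.port}{parsed.path}")
    print(f"DATABASE_USER={parsed.username}")
    print(f"DATABASE_PASSWORD={parsed.password}")
    ```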

  • Tiago ST (12/15/2022, 9:52 PM)
    Is there a hard limit for a table size of 2 GB?
    ```
    2022-12-15 21:28:03 source > Reading stream lis. Records read: 12700000
    2022-12-15 21:28:03 INFO i.a.w.g.DefaultReplicationWorker(lambda$readFromSrcAndWriteToDstRunnable$6):321 - Records read: 12711000 (2 GB)
    2022-12-15 21:28:04 INFO i.a.w.g.DefaultReplicationWorker(lambda$readFromSrcAndWriteToDstRunnable$6):321 - Records read: 12712000 (2 GB)
    2022-12-15 21:28:04 destination > 2022-12-15 21:28:04 INFO i.a.i.d.r.InMemoryRecordBufferingStrategy(lambda$flushAll$1):86 - Flushing lis: 31508 records (24 MB)
    2022-12-15 21:28:04 INFO i.a.w.g.DefaultReplicationWorker(lambda$readFromSrcAndWriteToDstRunnable$6):321 - Records read: 12713000 (2 GB)
    2022-12-15 21:28:04 INFO i.a.w.g.DefaultReplicationWorker(lambda$readFromSrcAndWriteToDstRunnable$6):321 - Records read: 12714000 (2 GB)
    2022-12-15 21:28:18 destination > 2022-12-15 21:28:18 ERROR i.a.i.b.FailureTrackingAirbyteMessageConsumer(accept):52 - Exception while accepting message
    2022-12-15 21:28:18 destination > java.lang.RuntimeException: java.sql.SQLException: The table '_airbyte_tmp_ppx_lis' is full
    ```

  • Tiago ST (12/15/2022, 9:53 PM)
    The destination part of the process also seems to fail; not sure what to do here:
    ```
    Database Error in model api_limits (models/generated/airbyte_tables/exchange_sink/api_limits.sql)
      1146 (42S02): Table 'exchange_sink._airbyte_raw_api_limits' doesn't exist
      compiled SQL at ../build/run/airbyte_utils/models/generated/airbyte_tables/exchange_sink/api_limits.sql
    ```

  • Tiago ST (12/15/2022, 9:53 PM)
    Shouldn't it create the table it needs? It looks like a tmp table of sorts too.

  • Lucas Souza Lira Silva (12/15/2022, 11:23 PM)
    Is there a connector in Airbyte to extract data from Google Analytics to ADLS Gen2?

  • Sam Stoelinga (12/15/2022, 11:39 PM)
    Seems my local Airbyte install got broken. I see this in the logs:
    ```
    airbyte-server                    | 2022-12-15 23:37:57 ERROR i.a.s.e.UncaughtExceptionMapper(toResponse):22 - Uncaught exception
    airbyte-server                    | java.lang.IllegalStateException: Duplicate key dbd64161-eb36-4d70-b462-62051452b2dc (attempted merging values io.airbyte.config.ActorCatalogFetchEvent@54a99403[id=<null>,actorId=dbd64161-eb36-4d70-b462-62051452b2dc,actorCatalogId=5c8e4ebd-f25e-4ba2-84ff-e85b1b80e279,configHash=<null>,connectorVersion=<null>,createdAt=1671131395] and io.airbyte.config.ActorCatalogFetchEvent@23daf9bb[id=<null>,actorId=dbd64161-eb36-4d70-b462-62051452b2dc,actorCatalogId=48f51c6a-56e3-4d1f-8e8f-ebadd5817ba3,configHash=<null>,connectorVersion=<null>,createdAt=1671131395])
    ```
    Last time I had to force delete all data to recover from this. Is there any other way to fix it?

  • Brian de la Motte (12/16/2022, 12:19 AM)
    Trying to install Airbyte on Kubernetes EKS with Helm and I'm getting a timeout, so I increased the timeout, but it seems to still be stuck. The db and minio pods seem to be stuck in Pending. Any ideas?
    ```
    airbyte % kubectl get pods    
    NAME                             READY   STATUS    RESTARTS   AGE
    0.40.25-airbyte-bootloader       1/1     Running   0          57m
    0.43.6-airbyte-bootloader        1/1     Running   0          33m
    airbyte-db-0                     0/1     Pending   0          17m
    airbyte-minio-0                  0/1     Pending   0          17m
    hmd-airbyte-airbyte-bootloader   1/1     Running   0          17m
    ```

  • Rahadian Djati (12/16/2022, 7:51 AM)
    Hi everyone, I have a Postgres DB in production using the json data type. Can Airbyte map JSON to BigQuery as JSON, not as a string?

  • Philip Corr (12/16/2022, 10:31 AM)
    Hello, I've been posting here when I have a PR to add new streams. Should I keep doing this? I just want to make sure I'm not annoying people here, etc. It would be good to know if there is a process for bringing PRs to Airbyte's attention once they are ready. The latest PR is here: https://github.com/airbytehq/airbyte/pull/20518 Thanks!

  • Valentyn Solonechnyi (12/16/2022, 12:40 PM)
    Hi there, FYI the docs seem to be down: https://docs.airbyte.com/

  • Krzysztof (12/16/2022, 12:54 PM)
    Hi guys, I cannot figure out from the sources which base image you are using to build connectors?

  • Anurag Jain (12/16/2022, 1:15 PM)
    How can I connect, from a different system, to Airbyte running on localhost of an AWS EC2 instance, using the EC2 instance's public/private IP?
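
    One common approach is to keep Airbyte bound to localhost on the instance and tunnel to it over SSH (e.g. ssh -L 8000:localhost:8000 against the instance's public IP) rather than exposing the port directly. A small Python sketch of the same idea, assuming `pip install sshtunnel`, that Airbyte listens on port 8000 on the instance, and placeholder host, user and key values:
    ```
    # Forward local port 8000 to the Airbyte UI/API running on the EC2 instance.
    from sshtunnel import SSHTunnelForwarder

    with SSHTunnelForwarder(
        ("ec2-public-ip.example.com", 22),       # EC2 public IP or DNS (placeholder)
        ssh_username="ec2-user",                  # placeholder
        ssh_pkey="~/.ssh/my-key.pem",             # placeholder
        remote_bind_address=("127.0.0.1", 8000),  # Airbyte on the instance
        local_bind_address=("127.0.0.1", 8000),
    ):
        print("Airbyte available at http://localhost:8000 while this runs")
        input("Press Enter to close the tunnel...")
    ```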