# ask-community-for-troubleshooting
  • a

    Andrzej Lewandowski

    11/03/2022, 9:34 AM
Hi. We've already deployed an EC2 instance (t2.large, as recommended) and run Airbyte using docker-compose, but the initial sync is very slow and CPU usage is almost 100%. Is t2.large really a good choice? Do you have any docs about performance or how to speed up the initial sync?
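One thing worth checking, since t2 instances are burstable: a long initial sync can drain CPU credits and leave the VM throttled. A minimal sketch (instance ID and region are placeholders) that pulls the CPUCreditBalance metric with boto3:

import datetime
import boto3

cloudwatch = boto3.client("cloudwatch", region_name="us-east-1")  # placeholder region
now = datetime.datetime.utcnow()
resp = cloudwatch.get_metric_statistics(
    Namespace="AWS/EC2",
    MetricName="CPUCreditBalance",
    Dimensions=[{"Name": "InstanceId", "Value": "i-0123456789abcdef0"}],  # placeholder instance
    StartTime=now - datetime.timedelta(hours=6),
    EndTime=now,
    Period=300,
    Statistics=["Average"],
)
for point in sorted(resp["Datapoints"], key=lambda p: p["Timestamp"]):
    print(point["Timestamp"], point["Average"])

If the balance sits near zero during syncs, moving to a non-burstable instance type (or enabling unlimited mode) is often a more direct fix than tuning Airbyte itself.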
  • l

    laila ribke

    11/03/2022, 10:49 AM
Hi all, I have a Google Ads -> Redshift connection with 7 streams. I've set the Incremental sync mode (deduped + history), with a sync every 24 hours. I see the Redshift unblended cost is 450€ per day(!), which is impossible. Can you set up a meeting with me to go over best practices for the Redshift destination? As you can see in the logs below, it doesn't load the data in one batch but runs every 5 seconds; I cancelled it when we realized it writes to Redshift every 5 seconds. This is what the engineer sent me: there is a query that runs every 5 seconds(!) that is the main cause of the mess we see:
INSERT INTO indiana._airbyte_tmp_zyc_indiana_clickout ( _airbyte_ab_id, _airbyte_data, _airbyte_emitted_at ) VALUES ($ 1, JSON_PARSE($ 2), $ 3), ($ 4, JSON_PARSE($ 5), $ 6), ($ 7, JSON_PARSE($ 8), $ 9), … ($ 373, JSON_PARSE($ 374), $
What is the best practice for using Redshift as a destination? Attached are the logs and a screenshot of the Redshift cost.
    9d9aea6f_8f2e_4ef1_9c3e_0bbf57935780_logs_236_txt.txt
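For context, the per-tuple INSERT pattern in that log is what the Redshift destination's plain INSERT upload mode produces; the S3 staging option loads batches with COPY instead, which is usually far cheaper. A quick way to confirm which pattern the cluster is actually seeing (a sketch; connection values are placeholders, and it assumes access to the STL_QUERY system table):

import psycopg2

conn = psycopg2.connect(
    host="my-cluster.abc123.eu-west-1.redshift.amazonaws.com",  # placeholder
    port=5439,
    dbname="dev",
    user="airbyte_user",
    password="***",
)
with conn, conn.cursor() as cur:
    # Count INSERT vs COPY statements executed in the last hour.
    cur.execute("""
        SELECT
            CASE WHEN querytxt ILIKE 'insert into%' THEN 'INSERT' ELSE 'COPY' END AS load_kind,
            COUNT(*) AS statements
        FROM stl_query
        WHERE starttime > GETDATE() - INTERVAL '1 hour'
          AND (querytxt ILIKE 'insert into%' OR querytxt ILIKE 'copy %')
        GROUP BY 1
    """)
    for load_kind, statements in cur.fetchall():
        print(load_kind, statements)

If it is almost all INSERTs, switching the destination to S3 staging is the usual first step before anything else.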
  • r

    Rahul Borse

    11/03/2022, 12:16 PM
Hi all, can someone help me figure out which class/file is responsible for creating the CSV file in a destination in the Airbyte code? I just want to understand how Airbyte is doing it.
  • d

    Dany Chepenko

    11/03/2022, 12:46 PM
Is anyone using the exchange_rates API in their setup? After upgrading to 0.40.17 I started receiving this error on the connection:
    2022-11-03 12:39:00 ERROR i.a.w.i.DefaultAirbyteStreamFactory(internalLog):113 - Check failed
  • s

    Shivam Kapoor

    11/03/2022, 12:59 PM
My source-lever-hiring connector is reading data at 1,000 records every 5 minutes. Is there any way to speed this up?
  • d

    Dudu Vaanunu

    11/03/2022, 1:22 PM
Hi all, we've been struggling with a pretty simple issue that doesn't seem to have a clear solution in Airbyte: the initial load of big tables from RDS (PG or MySQL) into a DWH (Snowflake). Airbyte just can't get through the initial load of a big table and fails over and over again. I thought about doing the initial load by exporting to a file and loading it into Snowflake, but:
1. I'm not sure I can create the target table for Airbyte.
2. The target table usually contains internal AB ids I can't generate.
3. I'm pretty sure Airbyte won't recognize the increment and will try to load the entire data set again.
How did you overcome this issue? Any creative workaround would be gladly accepted. Thanks!
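On point 2 specifically: the raw tables the Snowflake destination writes contain _airbyte_ab_id (a random UUID), _airbyte_data (the record as JSON), and _airbyte_emitted_at, so those values can be generated offline. A rough sketch (file names are placeholders) that rewrites a Postgres CSV export into that shape:

import csv
import json
import uuid
from datetime import datetime, timezone

emitted_at = datetime.now(timezone.utc).isoformat()

with open("opportunity_export.csv", newline="") as src, \
        open("airbyte_raw_opportunity.csv", "w", newline="") as dst:
    reader = csv.DictReader(src)
    writer = csv.writer(dst)
    writer.writerow(["_airbyte_ab_id", "_airbyte_data", "_airbyte_emitted_at"])
    for row in reader:
        # One raw record per source row: random UUID id, JSON payload, load timestamp.
        writer.writerow([str(uuid.uuid4()), json.dumps(row), emitted_at])

Whether Airbyte then picks up the increment correctly still depends on seeding the connection state, so treat this as a starting point rather than a full recipe.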
  • m

    Monika Bednarz

    11/03/2022, 1:24 PM
Hi Team 👋 I'm trying to set up a Google Ads integration, migrating it from another replication tool. The previous tool didn't require the developer token, though; the integration just needed to be authorized through the manager account. In Airbyte's docs I see that the token is required. Is there a way to set it up with only the authorization, using the Basic access level? (We host Airbyte internally, not using Airbyte Cloud.)
  • p

    Pierre Kerschgens

    11/03/2022, 1:40 PM
    Hey everybody, I have a question regarding the data types of the parquet format in the S3 destination. As source I want to sync data from a PostgreSQL database. The PostgreSQL DDL looks like this:
CREATE TABLE public.task_events (
    id serial4 NOT NULL,
    task_id int4 NULL,
    event_name varchar(20) NULL,
    ts timestamp NOT NULL,
    CONSTRAINT task_events_pkey PRIMARY KEY (id)
);
CREATE INDEX ix_task_events_task_id ON public.task_events USING btree (task_id);
CREATE INDEX ix_task_events_ts ON public.task_events USING btree (ts);
When I obtain the JSON schema from the Airbyte API I receive these types (looking good to me):
    print(json_schema['properties']['id'])
    {'type': 'number', 'airbyte_type': 'integer'}
    print(json_schema['properties']['ts'])
    {'type': 'string', 'format': 'date-time', 'airbyte_type': 'timestamp_without_timezone'}
    print(json_schema['properties']['task_id'])
    {'type': 'number', 'airbyte_type': 'integer'}
    print(json_schema['properties']['event_name'])
    {'type': 'string'}
Now I check the .parquet files written to S3 by Airbyte in AWS Glue, and they have these types:
    id double
    ts struct
    task_id double
    event_name string
So I guess the S3/Parquet destination converted id and task_id to double instead of int. This leads to IDs like "4.2108168E7" instead of "42108168". I hope someone can help with this. Thanks in advance! 🙂
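To see the types exactly as they were written (rather than how Glue classifies them), reading the Parquet footer directly is quick. A sketch assuming pyarrow and s3fs are installed; the bucket and key are placeholders:

import pyarrow.parquet as pq
import s3fs

fs = s3fs.S3FileSystem()
# Placeholder object key: point this at one of the files Airbyte wrote.
with fs.open("my-bucket/airbyte/task_events/2022_11_03_example.parquet", "rb") as f:
    print(pq.read_schema(f))

That shows whether the double typing comes from the file itself or from the Glue crawler's classification.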
  • d

    dandpz

    11/03/2022, 1:44 PM
Hi everyone, I am using the Amazon Seller Partner connector and I have issues running the report GET_SALES_AND_TRAFFIC_REPORT. Using the Full refresh sync mode it fails with a timeout; if I set it to Incremental append it works. According to the Amazon documentation, reports may take up to a week to populate data after a reporting period closes, so with the cursor field set to queryEndDate some data is missing. Has anyone else faced this issue? Thank you in advance for your help 🙂
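One workaround pattern for late-arriving report data (purely illustrative, not something the connector exposes as far as this thread goes): keep incremental append, but apply a lookback window to the saved cursor before each sync and deduplicate downstream.

from datetime import datetime, timedelta

LOOKBACK = timedelta(days=7)  # Amazon says reports can back-fill for up to a week

def effective_start(saved_cursor: str) -> str:
    """Shift the stored queryEndDate cursor back by the lookback window."""
    cursor = datetime.fromisoformat(saved_cursor)
    return (cursor - LOOKBACK).isoformat()

print(effective_start("2022-10-28T00:00:00"))  # -> 2022-10-21T00:00:00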
  • l

    Louis DAUVOIS

    11/03/2022, 2:03 PM
Hi everyone, I'm trying to set up the Salesforce connector, but I have a problem with the number of OpportunityLineItems the integration retrieves. I have 49,000 records, but in Airbyte I get only 14,900 records. For the Account object it seems to be fine. Does someone have a hint? https://discuss.airbyte.io/t/salesforce-connector-retrieve-only-14k-records-instead-of-49k-records/3094
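A quick sanity check that can help here (a sketch using the simple-salesforce package; credentials are placeholders): ask Salesforce for the total OpportunityLineItem count over the API and compare it with the row count in the destination, to rule out differences in what the API itself returns.

from simple_salesforce import Salesforce

sf = Salesforce(
    username="user@example.com",  # placeholders
    password="***",
    security_token="***",
)
result = sf.query("SELECT COUNT() FROM OpportunityLineItem")
print("OpportunityLineItem count via API:", result["totalSize"])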
  • l

    Leo G

    11/03/2022, 2:14 PM
Hi all
  • l

    Leo G

    11/03/2022, 2:15 PM
    I'm still having an issue with Snowflake's destination. It gives me a "Non-Json response" error. I can set up Snowflake source without any issues.
  • z

    Zaza Javakhishvili

    11/03/2022, 2:46 PM
Hi guys, can someone take a look at merging the Amazon SP changes? https://github.com/airbytehq/airbyte/pull/18683 https://github.com/airbytehq/airbyte/pull/18283
  • m

    Manitra Ranaivoharison

    11/03/2022, 3:13 PM
Hello, I am testing version 0.40.18 of Airbyte, and since an earlier version the config export no longer exists. Will this be permanent, or are you working on it?
  • v

    Varun Jain

    11/03/2022, 4:40 PM
In self-hosted Airbyte, how are credentials (e.g. the Snowflake password) stored? Can you point me to the relevant places in the codebase?
  • j

    João Larrosa

    11/03/2022, 8:41 PM
Hi mates! Question: is it possible for Airbyte to deliver only the first level of normalization in the destination, without the nested tables?
  • r

    Robert Put

    11/03/2022, 9:03 PM
Does anyone know if the Stripe source is capped at 1,000 records a minute? I can't seem to make it go any faster, on the initial sync or on the incremental syncs after it.
  • m

    Marlon Gómez

    11/03/2022, 10:00 PM
Hello. If I'm building a destination connector and I want to create a file inside the local temporary directory (/tmp/airbyte_local), what is the path inside the destination container?
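For reference, the documented convention for local mounts is that the host directory /tmp/airbyte_local is mounted at /local inside connector containers, so a destination writing "locally" should target /local. A tiny sketch under that assumption (the subdirectory name is a placeholder):

from pathlib import Path

# /local inside the container maps to /tmp/airbyte_local on the host
# (assuming the default docker-compose local mount is unchanged).
out_dir = Path("/local/my_destination_output")
out_dir.mkdir(parents=True, exist_ok=True)
(out_dir / "records.jsonl").write_text('{"hello": "world"}\n')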
  • b

    Boopaathy r s

    11/03/2022, 10:11 PM
Hi, I have installed Airbyte on an AWS EKS cluster and I'm trying to access the API endpoints from our application. I attached the ALB to the webapp-svc. When trying to access the API I'm getting Error 404, but the same request works fine via localhost when I do port-forwarding with kubectl port-forward. Any help fixing this issue is appreciated.
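One way to narrow this down (host names below are placeholders): hit the same health endpoint through the ALB and through the port-forward and compare, which shows whether the 404 comes from the load balancer's path routing or from the webapp itself.

import requests

BASES = [
    "http://my-alb-dns-name.elb.amazonaws.com",  # placeholder ALB hostname
    "http://localhost:8000",                     # placeholder port-forward address
]
for base in BASES:
    resp = requests.get(f"{base}/api/v1/health", timeout=10)
    print(base, resp.status_code, resp.text[:200])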
  • l

    le Minh Nguyen

    11/04/2022, 7:42 AM
In the LinkedIn Pages connector, I get this error while trying to set it up:
    Traceback (most recent call last):
    File "/airbyte/integration_code/main.py", line 13, in <module>
    launch(source, sys.argv[1:])
    File "/usr/local/lib/python3.9/site-packages/airbyte_cdk/entrypoint.py", line 123, in launch
    for message in source_entrypoint.run(parsed_args):
    File "/usr/local/lib/python3.9/site-packages/airbyte_cdk/entrypoint.py", line 96, in run
    check_config_against_spec_or_exit(connector_config, source_spec)
    File "/usr/local/lib/python3.9/site-packages/airbyte_cdk/sources/utils/schema_helpers.py", line 160, in check_config_against_spec_or_exit
    raise Exception("Config validation error: " + validation_error.message) from None
    Exception: Config validation error: '****' is not of type 'integer'
I followed every step except naming the app airbyte-source; not sure if this is related?
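The error itself comes from the CDK validating the config against the connector spec, so the masked value ('****') is being checked against a field the spec declares as an integer. A hypothetical illustration of that check with the jsonschema package (the field name is made up):

import jsonschema

spec = {"type": "object", "properties": {"org_id": {"type": "integer"}}}  # hypothetical field

jsonschema.validate({"org_id": 12345}, spec)    # passes
jsonschema.validate({"org_id": "12345"}, spec)  # raises ValidationError: not of type 'integer'

If that is what is happening here, the fix is to supply that config value as an actual number rather than a string; the app name is unlikely to matter.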
  • s

    Sebastian Brickel

    11/04/2022, 9:08 AM
Hey all, I am having a hard time setting up Google Search Console. I am on Airbyte 0.40.16 with connector version 0.1.18. I am using an existing service account that I set up for Google Ads. I granted it "Domain-wide Delegation" via the Workspace Admin Console, then created a new key and used the service account's email address together with the new JSON key in the auth. However, I get the following error message:
    The connection tests failed.
    "Unable to connect to Google Search Console API with the provided credentials - KeyError('siteEntry')"
I found this thread: Google Search Console - Service Account Key Authentication Failure. However, it is with regard to version 0.1.16, so hopefully that error was fixed. Is anyone with experience here willing to help me?
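The KeyError('siteEntry') plausibly means the sites list the connector fetches comes back empty, i.e. the delegated account sees no verified Search Console properties. A standalone check with the Google API client (the key path and delegation subject are placeholders):

from google.oauth2 import service_account
from googleapiclient.discovery import build

creds = service_account.Credentials.from_service_account_file(
    "sa-key.json",  # placeholder path to the JSON key
    scopes=["https://www.googleapis.com/auth/webmasters.readonly"],
).with_subject("user@yourdomain.com")  # user impersonated via domain-wide delegation

service = build("searchconsole", "v1", credentials=creds)
# An empty dict here (no 'siteEntry' key) means the impersonated user sees no verified sites.
print(service.sites().list().execute())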
  • j

    Julien Calderan

    11/04/2022, 10:00 AM
Hi everyone, I'm running Airbyte at my company to sync Salesforce (SF) data into BigQuery (BQ). Airbyte was our first choice as it seems to be the simplest, easiest to get going, and most cost-effective solution on the market for implementing ELT workflows. However, we ran into some trouble synchronizing "large" amounts of data from SF, and we couldn't figure out where the issue comes from (SF APIs? the Airbyte SF connector? Airbyte itself or one of its components such as Temporal? etc.).
Observed behaviors:
• SF -> BQ sync: works perfectly fine with smaller tables.
• SF (Opportunity) -> BQ: initial sync fails after a few days with the message "job cancelled" (no data committed to BQ).
• SF (Opportunity) -> Google Storage (parquet): initial sync failed after a few days with a replication error (data committed to GCS in parquet files, but with a lot of duplicates since the destination doesn't support dbt normalization and deduped history).
• SF (Opportunity, start-date=2022-01-01) -> BQ: initial sync finished successfully this morning (but we only synced a year's worth of data).
My questions:
• Do you have any ideas (or well-documented patterns 😄) on how we could implement batch ELT with Airbyte (i.e. sync multiple time-bound batches of data, either in parallel or sequentially)? It would be very interesting to sync the last month of data to the destination, then keep syncing back in time month by month (or in other time units).
• I feel like it would be nice to use Cloud Composer (or another workflow orchestration tool) to programmatically create time-bound connections (from start_date to end_date) in Airbyte, but it seems Airbyte enforces the pattern of a one-time full sync followed by smaller incremental syncs.
Our deployment:
• docker-compose deployment on GCP Compute Engine (single VM)
• VM: n2-standard-4 (4 vCPU, 16 GB RAM, 30 GB disk storage)
• SF connector v1.0.23
• BQ connector v1.2.5
Some information about our data:
• Table: Opportunity
• Row count: 12.5M
• SF storage size: 23.9 GB (not that much)
• Fields: over 500 (300+ user-defined fields, plus SF hidden system fields)
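On the orchestration question, the OSS API does allow driving this from an external scheduler by updating the source's start_date and triggering a sync per window. A rough sketch only; the host and UUIDs are placeholders, and it assumes the source spec exposes a start_date field (as the Salesforce source does):

import requests

API = "http://localhost:8000/api/v1"  # placeholder Airbyte host
SOURCE_ID = "<source-uuid>"           # placeholders
CONNECTION_ID = "<connection-uuid>"

def sync_window(start_date: str) -> None:
    src = requests.post(f"{API}/sources/get", json={"sourceId": SOURCE_ID}).json()
    cfg = src["connectionConfiguration"]
    cfg["start_date"] = start_date  # assumption: the source config uses this field name
    requests.post(f"{API}/sources/update", json={
        "sourceId": SOURCE_ID,
        "name": src["name"],
        "connectionConfiguration": cfg,
    }).raise_for_status()
    requests.post(f"{API}/connections/sync", json={"connectionId": CONNECTION_ID}).raise_for_status()

# Walk back in time one month at a time.
for window_start in ["2022-10-01T00:00:00Z", "2022-09-01T00:00:00Z", "2022-08-01T00:00:00Z"]:
    sync_window(window_start)

Note that the connection's saved state can prevent older windows from re-syncing, so a reset per window (or an append-only full refresh) may also be needed; that caveat is part of why this isn't a first-class pattern.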
  • c

    Charles Verleyen

    11/04/2022, 10:04 AM
Hi, not sure this is the right channel, but I couldn't find a channel specific to Grouparoo. I just wanted to know the roadmap and updates regarding Grouparoo. When I go to GitHub, I can see the repo has been archived by the owner. What is the new GitHub repo for Grouparoo? We are eagerly looking forward to using Grouparoo as it is open source (compared to Census and Hightouch), but we would like to be sure that active development and maintenance has continued since its acquisition by Airbyte. Thank you.
  • s

    Shanmuga Priyan

    11/04/2022, 11:06 AM
Hi, we are running Airbyte using the compose file provided in the repo and have set up some connections. But after some days the workspace volume took up all the space on our server; it was around 155 GB and the server ran out of storage. Is there any way to limit the workspace size? We haven't made any changes to the compose file.
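As a first step it can help to see which job directories are eating the space; the default docker-compose deployment keeps job workspaces under WORKSPACE_ROOT=/tmp/workspace inside the worker container. A small audit sketch under that assumption:

from pathlib import Path

root = Path("/tmp/workspace")  # default WORKSPACE_ROOT; adjust if overridden
sizes = []
for job_dir in root.iterdir():
    if not job_dir.is_dir():
        continue
    total = sum(f.stat().st_size for f in job_dir.rglob("*") if f.is_file())
    sizes.append((total, job_dir))

for total, job_dir in sorted(sizes, reverse=True)[:20]:
    print(f"{total / 1e9:7.2f} GB  {job_dir}")

If most of the space is old job attempts, pruning those directories is generally safe (they are mainly needed for viewing old logs), though the compose .env retention settings are worth reviewing before doing it by hand.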
  • o

    okasaka

    11/04/2022, 11:17 AM
Hi, is the File source connector bugged? I create a new source, configure it, and then the access check starts:
    airbyte-worker      | 2022-11-04 10:46:08 INFO i.a.c.i.LineGobbler(voidCall):114 -
    airbyte-worker      | 2022-11-04 10:46:08 INFO i.a.c.i.LineGobbler(voidCall):114 - ----- START CHECK -----
    airbyte-worker      | 2022-11-04 10:46:08 INFO i.a.c.i.LineGobbler(voidCall):114 -
    airbyte-worker      | 2022-11-04 10:46:08 INFO i.a.c.i.LineGobbler(voidCall):114 - Checking if airbyte/source-file:0.2.28 exists...
    airbyte-worker      | 2022-11-04 10:46:08 INFO i.a.c.i.LineGobbler(voidCall):114 - airbyte/source-file:0.2.28 was found locally.
    ・・・
    airbyte-worker      | 2022-11-04 10:46:11 INFO i.a.w.i.DefaultAirbyteStreamFactory(internalLog):117 - Checking access to <gs://source> csv about 200MB :)
About 30 seconds later it gives me a "Non-Json response" error in Airbyte's GUI, but no error appears in the log and the access check continues:
    ・・・
    airbyte-temporal    | {"level":"info","ts":"2022-11-04T10:47:51.313Z","msg":"none","service":"matching","component":"matching-engine","wf-task-queue-name":"/_sys/DISCOVER_SCHEMA/1","wf-task-queue-type":"Workflow","lifecycle":"Starting","logging-call-at":"taskQueueManager.go:238"}
    airbyte-temporal    | {"level":"info","ts":"2022-11-04T10:47:51.313Z","msg":"none","service":"matching","component":"matching-engine","wf-task-queue-name":"/_sys/DISCOVER_SCHEMA/1","wf-task-queue-type":"Workflow","lifecycle":"Started","logging-call-at":"taskQueueManager.go:242"}
    airbyte-temporal    | {"level":"info","ts":"2022-11-04T10:47:51.342Z","msg":"none","service":"matching","component":"matching-engine","wf-task-queue-name":"/_sys/DISCOVER_SCHEMA/3","wf-task-queue-type":"Activity","lifecycle":"Starting","logging-call-at":"taskQueueManager.go:238"}
    airbyte-temporal    | {"level":"info","ts":"2022-11-04T10:47:51.343Z","msg":"none","service":"matching","component":"matching-engine","wf-task-queue-name":"/_sys/DISCOVER_SCHEMA/3","wf-task-queue-type":"Activity","lifecycle":"Started","logging-call-at":"taskQueueManager.go:242"}
    airbyte-temporal    | {"level":"info","ts":"2022-11-04T10:47:51.390Z","msg":"none","service":"matching","component":"matching-engine","wf-task-queue-name":"/_sys/SYNC/3","wf-task-queue-type":"Workflow","lifecycle":"Starting","logging-call-at":"taskQueueManager.go:238"}
    airbyte-temporal    | {"level":"info","ts":"2022-11-04T10:47:51.390Z","msg":"none","service":"matching","component":"matching-engine","wf-task-queue-name":"/_sys/SYNC/3","wf-task-queue-type":"Workflow","lifecycle":"Started","logging-call-at":"taskQueueManager.go:242"}
    airbyte-temporal    | {"level":"info","ts":"2022-11-04T10:47:52.350Z","msg":"none","service":"matching","component":"matching-engine","wf-task-queue-name":"/_sys/CONNECTION_UPDATER/1","wf-task-queue-type":"Workflow","lifecycle":"Starting","logging-call-at":"taskQueueManager.go:238"}
    airbyte-temporal    | {"level":"info","ts":"2022-11-04T10:47:52.350Z","msg":"none","service":"matching","component":"matching-engine","wf-task-queue-name":"/_sys/CONNECTION_UPDATER/1","wf-task-queue-type":"Workflow","lifecycle":"Started","logging-call-at":"taskQueueManager.go:242"}
    airbyte-temporal    | {"level":"info","ts":"2022-11-04T10:47:52.424Z","msg":"none","service":"matching","component":"matching-engine","wf-task-queue-name":"/_sys/GET_SPEC/3","wf-task-queue-type":"Workflow","lifecycle":"Starting","logging-call-at":"taskQueueManager.go:238"}
    airbyte-temporal    | {"level":"info","ts":"2022-11-04T10:47:52.424Z","msg":"none","service":"matching","component":"matching-engine","wf-task-queue-name":"/_sys/GET_SPEC/3","wf-task-queue-type":"Workflow","lifecycle":"Started","logging-call-at":"taskQueueManager.go:242"}
    airbyte-cron        | 2022-11-04 10:48:01 INFO i.a.c.s.DefinitionsUpdater(updateDefinitions):54 - Connector definitions update disabled.
    airbyte-server      | 2022-11-04 10:48:02 INFO i.a.s.RequestLogger(filter):112 - REQ 172.25.0.6 GET 200 /api/v1/health
    ・・・
Finally, the access check succeeded:
    airbyte-worker      | 2022-11-04 10:48:43 INFO i.a.w.i.DefaultAirbyteStreamFactory(internalLog):117 - Check succeeded
    airbyte-worker      | 2022-11-04 10:48:43 INFO i.a.w.t.TemporalAttemptExecution(get):162 - Stopping cancellation check scheduling...
    airbyte-worker      | 2022-11-04 10:48:43 INFO i.a.c.i.LineGobbler(voidCall):114 -
    airbyte-worker      | 2022-11-04 10:48:43 INFO i.a.c.i.LineGobbler(voidCall):114 - ----- END CHECK -----
    airbyte-worker      | 2022-11-04 10:48:43 INFO i.a.c.i.LineGobbler(voidCall):114 -
But the new source didn't get registered in Sources, probably because Airbyte misinterpreted the "Non-Json response" error as a failure.
  • a

    Aazam Thakur

    11/04/2022, 11:28 AM
Hi team, my PR for the connector contest was being reviewed by a contributor but wasn't merged despite being submitted before the deadline. Is the contest officially over? I would appreciate knowing, since I was working on it for a week 😅
  • b

    Berzan Yildiz

    11/04/2022, 11:34 AM
Is there a way to perform more complex queries with the Asana connector? Specifically, I want to get the tasks related to a project, not all tasks in my workspace. Or is there any way to pass HTTP query params to a stream?
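For comparison, outside Airbyte the project filter is just a query parameter on Asana's tasks endpoint, so this is a question of the connector not exposing it rather than an API limitation. A sketch with a placeholder token and project gid:

import requests

resp = requests.get(
    "https://app.asana.com/api/1.0/tasks",
    headers={"Authorization": "Bearer <personal-access-token>"},  # placeholder
    params={"project": "1203456789012345", "limit": 100},         # placeholder project gid
)
resp.raise_for_status()
print([task["name"] for task in resp.json()["data"]])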
  • e

    Espoir Murhabazi

    11/04/2022, 12:03 PM
Hey guys, is anyone willing to help? I have a connector saving data to a Snowflake data warehouse as my destination. I updated the source schemas, added the new columns, and refreshed my schemas, and it seemed to be working yesterday.
  • e

    Espoir Murhabazi

    11/04/2022, 12:04 PM
But now it is no longer working. I checked Snowflake and the added columns are not there. It then returns the following message:
    Database Error in model AD_UNIT_PER_REFERRER_REPORT_STREAM_SCD (models/generated/airbyte_incremental/scd/GOOGLE_AD_MANAGER/AD_UNIT_PER_REFERRER_REPORT_STREAM_SCD.sql)
      000904 (42000): SQL compilation error: error line 36 at position 31
      invalid identifier 'AD_UNIT_ID'
  • e

    Espoir Murhabazi

    11/04/2022, 12:04 PM
where AD_UNIT_ID is the column I added, which is missing.
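When normalization complains about an invalid identifier like this, it can help to check whether the new column ever reached Snowflake (in the raw table versus the SCD model that dbt builds). A hedged check with the Snowflake Python connector; the account, credentials, and database are placeholders:

import snowflake.connector

conn = snowflake.connector.connect(
    account="xy12345.eu-west-1",  # placeholders
    user="AIRBYTE_USER",
    password="***",
    warehouse="AIRBYTE_WH",
    database="ANALYTICS",
)
cur = conn.cursor()
cur.execute(
    """
    SELECT table_name, column_name
    FROM information_schema.columns
    WHERE table_schema = 'GOOGLE_AD_MANAGER'
      AND column_name = 'AD_UNIT_ID'
    """
)
print(cur.fetchall())

If the column only exists in the raw table (or nowhere at all), refreshing the source schema and resetting the affected stream usually gets the SCD model rebuilt with the new column.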