# ask-community-for-troubleshooting
  • Kevin Grismore

    11/01/2022, 9:24 PM
    Hey, I'm a bit confused about whether the upsert capability in the Elasticsearch destination is usable with my connection. When setting up the destination in the UI, the tooltip on the upsert toggle says:
    Upsert Records
    If a primary key identifier is defined in the source, an upsert will be performed using the primary key value as the elasticsearch doc id. Does not support composite primary keys.
    The source for my connection is BigQuery, which doesn't have primary keys. Instead I construct a surrogate key field called id. However, I can't select a primary key in the connection because the Elasticsearch destination doesn't support Incremental Sync - Deduped History.
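    A minimal sketch of the workaround being discussed: derive a deterministic surrogate id from the record's natural-key fields so that re-syncing the same record always maps to the same Elasticsearch doc id. The field names are hypothetical, not the connector's own code:

    import hashlib

    def surrogate_id(*natural_key_fields) -> str:
        # Hash the natural-key fields into a stable id; syncing the
        # same record twice yields the same id, so an upsert overwrites
        # the earlier document instead of duplicating it.
        raw = "|".join(str(f) for f in natural_key_fields)
        return hashlib.md5(raw.encode("utf-8")).hexdigest()

    # e.g. id = surrogate_id(event_date, user_id, event_name)
    print(surrogate_id("2022-11-01", "user_42", "click"))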
  • Jin Gong

    11/01/2022, 10:24 PM
    Hi team, it's been a while since I posted this feedback about the Zoom integration. Is there any progress on migrating away from Singer? It's been a big pain for us. Lately I found that Singer doesn't properly handle Zoom API pagination, but no one is maintaining it. cc @Marcos Marx (Airbyte), as you kindly answered my question last time πŸ™
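    For context, Zoom's REST API paginates with a next_page_token cursor, and a client has to thread that token through every request; this is the kind of handling the Singer tap reportedly gets wrong. A rough sketch, assuming a bearer token and a generic list endpoint:

    import requests

    def fetch_all(url: str, token: str, records_key: str):
        # Follow Zoom's next_page_token cursor until the API stops
        # returning one, yielding records as they arrive.
        headers = {"Authorization": f"Bearer {token}"}
        params = {"page_size": 300}
        while True:
            body = requests.get(url, headers=headers, params=params).json()
            yield from body.get(records_key, [])
            next_token = body.get("next_page_token")
            if not next_token:
                break
            params["next_page_token"] = next_token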
  • Juan Felipe GΓ³mez LΓ³pez

    11/01/2022, 10:57 PM
    Hi everyone, I wanted to know if it is possible to use Airbyte as a Python package. We want to use Airbyte in our ingestion pipelines, but we are required to launch containers running these processes and to customize the actual connections.
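    Airbyte is not distributed as a Python package, but the open-source deployment exposes a REST API, so a pipeline can drive it from Python. A minimal sketch, assuming a local deployment on the default port and an existing connection id:

    import requests

    AIRBYTE_API = "http://localhost:8000/api/v1"  # assumed local deployment

    def trigger_sync(connection_id: str) -> dict:
        # Kick off a manual sync for an existing connection and return
        # the job descriptor so the caller can poll its status.
        resp = requests.post(
            f"{AIRBYTE_API}/connections/sync",
            json={"connectionId": connection_id},
        )
        resp.raise_for_status()
        return resp.json()["job"]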
  • Gary K

    11/02/2022, 5:12 AM
    Hi everyone. Has anyone tried using Airbyte to EL data (not necessarily doing normalisation) into an Azure Synapse Analytics dedicated pool? I can get as far as this error when trying to set up the MS SQL Server connector:
    State code: S0001; Error code: 104467; Message: Enforced unique constraints are not supported. To create an unenforced unique constraint you must include the NOT ENFORCED syntax as part of your statement.
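    The error is Synapse itself rejecting an enforced unique constraint: dedicated SQL pools only accept the metadata-only NOT ENFORCED variant, which the connector's generated DDL does not emit. A hedged illustration of the syntax the message asks for (table name and connection string are hypothetical):

    import os
    import pyodbc  # Synapse dedicated pools speak T-SQL over ODBC

    # Dedicated pools reject enforced uniqueness; only the unenforced,
    # metadata-only form of the constraint is allowed.
    DDL = """
    ALTER TABLE dbo.my_table
    ADD CONSTRAINT uq_my_table_id UNIQUE (id) NOT ENFORCED;
    """

    conn = pyodbc.connect(os.environ["SYNAPSE_ODBC_CONN_STR"], autocommit=True)
    conn.execute(DDL)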
  • Cain Do

    11/02/2022, 6:05 AM
    I tried running Airbyte on a Windows machine with Docker for the first time and was unable to get the server started; it would give these errors:
    airbyte-temporal    | {"level":"info","ts":"2022-11-02T06:06:03.169Z","msg":"none","service":"matching","component":"matching-engine","wf-task-queue-name":"/_sys/temporal-sys-add-search-attributes-task-queue/3","wf-task-queue-type":"Workflow","lifecycle":"Starting","logging-call-at":"taskQueueManager.go:238"}
    airbyte-temporal    | {"level":"info","ts":"2022-11-02T06:06:03.169Z","msg":"none","service":"matching","component":"matching-engine","wf-task-queue-name":"/_sys/temporal-sys-add-search-attributes-task-queue/3","wf-task-queue-type":"Workflow","lifecycle":"Started","logging-call-at":"taskQueueManager.go:242"}
    airbyte-temporal    | {"level":"info","ts":"2022-11-02T06:06:03.170Z","msg":"none","service":"matching","component":"matching-engine","wf-task-queue-name":"/_sys/temporal-sys-add-search-attributes-task-queue/3","wf-task-queue-type":"Activity","lifecycle":"Starting","logging-call-at":"taskQueueManager.go:238"}
    airbyte-temporal    | {"level":"info","ts":"2022-11-02T06:06:03.170Z","msg":"none","service":"matching","component":"matching-engine","wf-task-queue-name":"/_sys/temporal-sys-add-search-attributes-task-queue/3","wf-task-queue-type":"Activity","lifecycle":"Started","logging-call-at":"taskQueueManager.go:242"}
    airbyte-cron        | 2022-11-02 06:06:04 WARN i.a.c.t.TemporalUtils(getTemporalClientWhenConnected):245 - Ignoring exception while trying to request Temporal namespace:
    airbyte-cron        | io.grpc.StatusRuntimeException: DEADLINE_EXCEEDED: Deadline exceeded after 9.999605457s.
  • Shivam Kapoor

    11/02/2022, 12:05 PM
    Hi folks, I have built a custom connector, but when I import it into Airbyte I see a lot of sidecar init containers starting, like relay-stdout, relay-stderr, remote-stdin, etc., which pull images from Docker Hub. Now I want to remove all dependency on Docker Hub because of rate-limiting issues. I can push all these images to my own private repo, but how do I change these references in my deployment? I see that these images are defined here. Will I have to redeploy the worker?
  • laila ribke

    11/02/2022, 1:05 PM
    Hi, I need to set up a Wise source and I see it isn't available. Has anyone set up a Wise source via their API who can help me do it?
  • Abhishek Sachdeva

    11/02/2022, 1:22 PM
    Airbyte missed a record while syncing from HubSpot to Postgres; I caught this error manually as there were only 100 records. How do I make sure this doesn't happen at scale? Is there any way to catch these errors? My guess: somewhere while requesting the data from HubSpot, Airbyte passed some timestamp (like last_synced) in the payload, and HubSpot didn't return the data because the objects were created only a few seconds apart.
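    If that guess is right, the standard mitigation is a lookback window: start each incremental read a little before the saved cursor so records committed around the sync boundary are re-read, and deduplicate downstream. A sketch of the idea; the createdAfter parameter name is hypothetical:

    from datetime import datetime, timedelta

    LOOKBACK = timedelta(minutes=15)  # assumed safety buffer

    def build_request_params(last_synced: datetime) -> dict:
        # Rewind the cursor by the lookback window so records created
        # just before the previous sync finished are not skipped.
        start = last_synced - LOOKBACK
        return {"createdAfter": start.isoformat()}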
  • DR

    11/02/2022, 1:26 PM
    I am trying to start Airbyte, but docker-compose up fails with the error below:
    ERROR: Head "<https://registry-1.docker.io/v2/airbyte/init/manifests/0.40.18>": unauthorized: incorrect username or password
    work/software/airbyte2/airbyte (master)$ sudo docker-compose up
    WARNING: The RUN_DATABASE_MIGRATION_ON_STARTUP variable is not set. Defaulting to a blank string.
    WARNING: The DEPLOYMENT_MODE variable is not set. Defaulting to a blank string.
    WARNING: The LOG_CONNECTOR_MESSAGES variable is not set. Defaulting to a blank string.
    WARNING: The SECRET_PERSISTENCE variable is not set. Defaulting to a blank string.
    WARNING: The JOB_ERROR_REPORTING_SENTRY_DSN variable is not set. Defaulting to a blank string.
    WARNING: The NEW_SCHEDULER variable is not set. Defaulting to a blank string.
    WARNING: The WORKER_ENVIRONMENT variable is not set. Defaulting to a blank string.
    WARNING: The GITHUB_STORE_BRANCH variable is not set. Defaulting to a blank string.
    WARNING: The REMOTE_CONNECTOR_CATALOG_URL variable is not set. Defaulting to a blank string.
    WARNING: The TEMPORAL_HISTORY_RETENTION_IN_DAYS variable is not set. Defaulting to a blank string.
    WARNING: The UPDATE_DEFINITIONS_CRON_ENABLED variable is not set. Defaulting to a blank string.
    Creating network "airbyte_default" with the default driver
    Creating network "airbyte_airbyte_internal" with the default driver
    Creating network "airbyte_airbyte_public" with the default driver
    Pulling init (airbyte/init:0.40.18)...
    ERROR: Head "<https://registry-1.docker.io/v2/airbyte/init/manifests/0.40.18>": unauthorized: incorrect username or password
  • Balaji Seetharaman

    11/02/2022, 1:36 PM
    Hi team, I am getting this error while working on a new connector. Can anyone help me with it?
    airbyte_cdk.sources.declarative.parsers.undefined_reference_exception.UndefinedReferenceException: Undefined reference definitions.base_requester.url_base from ('definitions', 'retriever', 'url_base')
  • laila ribke

    11/02/2022, 1:44 PM
    Hi all, I have a Google Ads -> Redshift connection with 7 streams. I've set an incremental sync mode (deduped + history), with a sync every 24 hours. I see the Redshift unblended cost is 450€ per day(!), which is impossible. Can you set up a meeting with me to go over best practice for working with the Redshift destination? Because as you can see in the logs below, it doesn't load the data in one batch but runs inserts every 5 seconds.
  • laila ribke

    11/02/2022, 1:44 PM
    There is a query that runs every 5 seconds(!) and is the main cause of the mess we see: INSERT INTO indiana._airbyte_tmp_zyc_indiana_clickout ( _airbyte_ab_id, _airbyte_data, _airbyte_emitted_at ) VALUES ($ 1, JSON_PARSE($ 2), $ 3), ($ 4, JSON_PARSE($ 5), $ 6), ($ 7, JSON_PARSE($ 8), $ 9), ... repeating the same three-parameter tuple for hundreds of bound parameters, through ($ 373, JSON_PARSE($ 374), $ where the paste is truncated.
  • Paul Rus

    11/02/2022, 1:52 PM
    I have a question: I'm using Airbyte to migrate data from BigQuery to Postgres, both in Google Cloud.
  • Nikita Nazarov

    11/02/2022, 3:25 PM
    Hi, when I try to update Airbyte from 0.39.13 to 0.40.17 via the Helm chart, I encounter the following error on the server pod:
    2022-11-02 15:28:00 INFO c.z.h.HikariDataSource(<init>):80 - HikariPool-2 - Starting...
    2022-11-02 15:28:00 INFO c.z.h.HikariDataSource(<init>):82 - HikariPool-2 - Start completed.
    2022-11-02 15:28:00 DEBUG c.z.h.p.HikariPool(logPoolState):414 - HikariPool-1 - Before cleanup stats (total=0, active=0, idle=0, waiting=0)
    2022-11-02 15:28:00 DEBUG c.z.h.p.HikariPool(logPoolState):414 - HikariPool-1 - After cleanup  stats (total=0, active=0, idle=0, waiting=0)
    2022-11-02 15:28:00 DEBUG c.z.h.p.HikariPool(fillPool):521 - HikariPool-1 - Fill pool skipped, pool has sufficient level or currently being filled (queueDepth=0).
    2022-11-02 15:28:00 DEBUG c.z.h.p.HikariPool(logPoolState):414 - HikariPool-2 - Before cleanup stats (total=0, active=0, idle=0, waiting=0)
    2022-11-02 15:28:00 DEBUG c.z.h.p.HikariPool(logPoolState):414 - HikariPool-2 - After cleanup  stats (total=0, active=0, idle=0, waiting=0)
    2022-11-02 15:28:00 DEBUG c.z.h.p.HikariPool(fillPool):521 - HikariPool-2 - Fill pool skipped, pool has sufficient level or currently being filled (queueDepth=0).
    2022-11-02 15:28:01 DEBUG o.f.c.i.l.s.Slf4jLog(debug):33 - Scanning for classpath resources at 'classpath:db/callback' ...
    2022-11-02 15:28:01 DEBUG o.f.c.i.l.s.Slf4jLog(debug):33 - Determining location urls for classpath:db/callback using ClassLoader jdk.internal.loader.ClassLoaders$AppClassLoader@7a4f0f29 ...
    2022-11-02 15:28:01 DEBUG o.f.c.i.l.s.Slf4jLog(debug):33 - Unable to resolve location classpath:db/callback.
    2022-11-02 15:28:01 DEBUG o.f.c.i.l.s.Slf4jLog(debug):33 - Scanning for classpath resources at 'classpath:db/callback' ...
    2022-11-02 15:28:01 DEBUG o.f.c.i.l.s.Slf4jLog(debug):33 - Determining location urls for classpath:db/callback using ClassLoader jdk.internal.loader.ClassLoaders$AppClassLoader@7a4f0f29 ...
    2022-11-02 15:28:01 DEBUG o.f.c.i.l.s.Slf4jLog(debug):33 - Unable to resolve location classpath:db/callback.
    2022-11-02 15:22:16 DEBUG i.a.c.h.LogClientSingleton(setWorkspaceMdc):150 - Setting kube workspace mdc
    2022-11-02 15:22:16 ERROR i.a.s.ServerApp(main):326 - Server failed
    java.lang.IllegalArgumentException: null
        at com.google.common.base.Preconditions.checkArgument(Preconditions.java:131) ~[guava-31.1-jre.jar:?]
        at io.airbyte.config.storage.DefaultS3ClientFactory.validateBase(DefaultS3ClientFactory.java:36) ~[io.airbyte.airbyte-config-config-models-0.40.17.jar:?]
        at io.airbyte.config.storage.DefaultS3ClientFactory.validate(DefaultS3ClientFactory.java:31) ~[io.airbyte.airbyte-config-config-models-0.40.17.jar:?]
        at io.airbyte.config.storage.DefaultS3ClientFactory.<init>(DefaultS3ClientFactory.java:24) ~[io.airbyte.airbyte-config-config-models-0.40.17.jar:?]
        at io.airbyte.config.helpers.CloudLogs.createCloudLogClient(CloudLogs.java:45) ~[io.airbyte.airbyte-config-config-models-0.40.17.jar:?]
        at io.airbyte.config.helpers.LogClientSingleton.createCloudClientIfNull(LogClientSingleton.java:164) ~[io.airbyte.airbyte-config-config-models-0.40.17.jar:?]
        at io.airbyte.config.helpers.LogClientSingleton.setWorkspaceMdc(LogClientSingleton.java:151) ~[io.airbyte.airbyte-config-config-models-0.40.17.jar:?]
        at io.airbyte.server.ServerApp.getServer(ServerApp.java:174) ~[io.airbyte-airbyte-server-0.40.17.jar:?]
        at io.airbyte.server.ServerApp.main(ServerApp.java:323) ~[io.airbyte-airbyte-server-0.40.17.jar:?]
    2022-11-02 15:22:16 INFO c.z.h.HikariDataSource(close):350 - HikariPool-1 - Shutdown initiated...
  • Yifei Yin

    11/02/2022, 3:25 PM
    Hey team, I'm encountering an error in Docker: airbyte-server/0.40.17 with the airbyte/source-s3/v0.1.25 connector. I couldn't find similar errors online. airbyte/source-postgres/v1.0.18 suffers from the same issue, which can be observed when setting up connections, refreshing the source schema, and adding connectors. The error appeared after docker-compose kill; docker-compose up. kill was used after the EC2 container hung for 12 hours running an s3 -> postgres job, and down failed multiple times.
  • Zaza Javakhishvili

    11/02/2022, 3:29 PM
    Hi guys, could someone take a look at merging the Amazon SP changes? https://github.com/airbytehq/airbyte/pull/18683 https://github.com/airbytehq/airbyte/pull/18283
  • Coleman Kelleghan

    11/02/2022, 6:23 PM
    Hi Airbyte, we are seeing failed syncs with a GA (Universal Analytics) source. The error appears to be related to the type of a ga_datehourminute__dbt_alter column:
    Message: Normalization failed during the dbt run. This may indicate a problem with the data itself.
    2022-11-01 18:56:57 normalization > 25 of 27 OK created incremental model public.table_ga_session_descriptors_2............................................. [INSERT 0 185295 in 59.57s]
    2022-11-01 18:56:57 normalization > Finished running 27 incremental models in 104.87s.
    2022-11-01 18:56:57 normalization > Completed with 1 error and 0 warnings:
    2022-11-01 18:56:57 normalization > Database Error in model table_ga_session_descriptors_1 (models/generated/airbyte_incremental/public/table_ga_session_descriptors_1.sql)
    2022-11-01 18:56:57 normalization >   column "ga_datehourminute__dbt_alter" is of type bigint but expression is of type text
    2022-11-01 18:56:57 normalization >   LINE 4: ...scriptors_1" set "ga_datehourminute__dbt_alter" = "ga_dateho...
    2022-11-01 18:56:57 normalization >                                                                ^
    2022-11-01 18:56:57 normalization >   HINT:  You will need to rewrite or cast the expression.
    2022-11-01 18:56:57 normalization > Done. PASS=26 WARN=0 ERROR=1 SKIP=0 TOTAL=27
    2022-11-01 18:56:57 INFO i.a.w.p.KubePodProcess(close):737 - (pod: homelander-airbyte-external / normalization-normalize-4-0-aclyw) - Closed all resources for pod
    2022-11-01 18:56:57 INFO i.a.w.g.DefaultNormalizationWorker(run):82 - Normalization executed in 2 minutes 13 seconds.
    2022-11-01 18:56:57 ERROR i.a.w.g.DefaultNormalizationWorker(run):90 - Normalization Failed.
    2022-11-01 18:56:57 INFO i.a.w.g.DefaultNormalizationWorker(run):95 - Normalization summary: io.airbyte.config.NormalizationSummary@5c90d2f3[startTime=1667328884085,endTime=1667329017823,failures=[io.airbyte.config.FailureReason@52784d4d[failureOrigin=normalization,failureType=system_error,internalMessage=column "ga_datehourminute__dbt_alter" is of type bigint but expression is of type text,externalMessage=Normalization failed during the dbt run. This may indicate a problem with the data itself.,metadata=io.airbyte.config.Metadata@26f99ec5[additionalProperties={attemptNumber=0, jobId=4, from_trace_message=true}],stacktrace=AirbyteDbtError: 
    20 of 27 ERROR creating incremental model public.table_ga_session_descriptors_1......................................... [ERROR in 37.84s]
    Database Error in model table_ga_session_descriptors_1 (models/generated/airbyte_incremental/public/table_ga_session_descriptors_1.sql)
      column "ga_datehourminute__dbt_alter" is of type bigint but expression is of type text
      LINE 4: ...scriptors_1" set "ga_datehourminute__dbt_alter" = "ga_dateho...
                                                                   ^
      HINT:  You will need to rewrite or cast the expression.
    20 of 27 ERROR creating incremental model public.table_ga_session_descriptors_1......................................... [ERROR in 37.84s]
    Database Error in model table_ga_session_descriptors_1 (models/generated/airbyte_incremental/public/table_ga_session_descriptors_1.sql)
      column "ga_datehourminute__dbt_alter" is of type bigint but expression is of type text
      LINE 4: ...scriptors_1" set "ga_datehourminute__dbt_alter" = "ga_dateho...
                                                                   ^
      HINT:  You will need to rewrite or cast the expression.,retryable=<null>,timestamp=1667329017641], io.airbyte.config.FailureReason@1be29b24[failureOrigin=normalization,failureType=system_error,internalMessage=column "ga_datehourminute__dbt_alter" is of type bigint but expression is of type text,externalMessage=Normalization failed during the dbt run. This may indicate a problem with the data itself.,metadata=io.airbyte.config.Metadata@41bfd5e1[additionalProperties={attemptNumber=0, jobId=4, from_trace_message=true}],stacktrace=AirbyteDbtError: 
    20 of 27 ERROR creating incremental model public.table_ga_session_descriptors_1......................................... [ERROR in 37.84s]
    Database Error in model table_ga_session_descriptors_1 (models/generated/airbyte_incremental/public/table_ga_session_descriptors_1.sql)
      column "ga_datehourminute__dbt_alter" is of type bigint but expression is of type text
      LINE 4: ...scriptors_1" set "ga_datehourminute__dbt_alter" = "ga_dateho...
                                                                   ^
      HINT:  You will need to rewrite or cast the expression.
    20 of 27 ERROR creating incremental model public.table_ga_session_descriptors_1......................................... [ERROR in 37.84s]
    Database Error in model table_ga_session_descriptors_1 (models/generated/airbyte_incremental/public/table_ga_session_descriptors_1.sql)
      column "ga_datehourminute__dbt_alter" is of type bigint but expression is of type text
      LINE 4: ...scriptors_1" set "ga_datehourminute__dbt_alter" = "ga_dateho...
                                                                   ^
      HINT:  You will need to rewrite or cast the expression.,retryable=<null>,timestamp=1667329017641]]]
    2022-11-01 18:56:57 INFO i.a.c.i.LineGobbler(voidCall):114 -
    2022-11-01 18:56:57 INFO i.a.c.i.LineGobbler(voidCall):114 - ----- END DEFAULT NORMALIZATION -----
    Is this a problem with how GA is providing the ga_datehourminute values? Thanks
  • JoΓ£o Larrosa

    11/02/2022, 6:26 PM
    Hey mates! I'm trying to add new sources through the API, but I can't find the schemas for each source for the 'connectionConfiguration' field. Can somebody help me out with it? Thank you very much!
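    The configuration API can return the JSON schema that connectionConfiguration must satisfy for a given source definition. A sketch against a local deployment (URL and ids are assumptions):

    import requests

    AIRBYTE_API = "http://localhost:8000/api/v1"  # assumed local deployment

    def get_source_spec(source_definition_id: str, workspace_id: str) -> dict:
        # The returned connectionSpecification is the JSON schema that
        # connectionConfiguration must conform to for this source.
        resp = requests.post(
            f"{AIRBYTE_API}/source_definition_specifications/get",
            json={
                "sourceDefinitionId": source_definition_id,
                "workspaceId": workspace_id,
            },
        )
        resp.raise_for_status()
        return resp.json()["connectionSpecification"]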
  • Ethan Brouwer

    11/02/2022, 6:34 PM
    Does anyone know if it's possible to use Redshift as the destination with a custom table schema? I have really simple JSON records coming from the source: nothing nested, literally just a key->value map. Could I perhaps use a more generic JDBC connector? It just seems so inefficient to throw everything into a JSON blob.
  • Leo G

    11/02/2022, 8:08 PM
    I'm trying to set up a Snowflake source or destination and I get "The connection tests failed. non-json response". What am I doing wrong?
  • Brian Castelli

    11/02/2022, 9:23 PM
    Hello, all. I am trying to create an Azure Table Storage source via the Airbyte GUI: Azure Table <> S3 bucket. The connection and sync succeed, but I only get the first column of my data in the S3 destination file. This seems to be expected based on what I see in the connection's Replication streams output:
  • Slackbot

    11/02/2022, 9:25 PM
    This message was deleted.
  • agathianspy

    11/02/2022, 9:53 PM
    Is Airbyte open source a good solution for automating a once-daily download of a file from an FTP or a site that has its own data structure, extracting just a select part of that data into a new CSV document, and then finally uploading that through a new SFTP?
  • Thomas Xiong

    11/02/2022, 11:19 PM
    Is there an easy way to use the discover method locally to help generate a configured_catalog.json file for local testing without using the UI?
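    One way to do this, sketched assuming a standard Python CDK connector layout: run discover directly, capture the CATALOG message it prints, and wrap every discovered stream with default sync modes.

    import json
    import subprocess

    # Run the connector's discover command and capture its stdout.
    out = subprocess.run(
        ["python", "main.py", "discover", "--config", "secrets/config.json"],
        capture_output=True, text=True, check=True,
    ).stdout

    for line in out.splitlines():
        msg = json.loads(line)
        if msg.get("type") == "CATALOG":
            # Wrap each stream with default sync modes to produce a
            # configured catalog usable by the read command.
            configured = {
                "streams": [
                    {
                        "stream": stream,
                        "sync_mode": "full_refresh",
                        "destination_sync_mode": "overwrite",
                    }
                    for stream in msg["catalog"]["streams"]
                ]
            }
            with open("integration_tests/configured_catalog.json", "w") as f:
                json.dump(configured, f, indent=2)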
  • Thomas Xiong

    11/03/2022, 12:17 AM
    We're trying to use the local UI to develop a new source, but when we try to set it up in the UI, the options we specify in spec.yaml don't show up in the modal. Running main.py spec works locally, and it looks like the front end is successfully fetching our spec.yaml. Does anyone know what might be the issue here?
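    A quick way to narrow this down, assuming the same connector layout: print exactly what the spec command emits and confirm that connectionSpecification.properties contains the expected option names, since the UI modal is rendered from that schema.

    import json
    import subprocess

    out = subprocess.run(
        ["python", "main.py", "spec"],
        capture_output=True, text=True, check=True,
    ).stdout

    for line in out.splitlines():
        msg = json.loads(line)
        if msg.get("type") == "SPEC":
            # The setup modal is rendered from this JSON schema; if the
            # expected property names are missing here, the modal will
            # be empty too.
            spec = msg["spec"]["connectionSpecification"]
            print(sorted(spec.get("properties", {})))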
  • Michael Zhou

    11/03/2022, 1:21 AM
    Hello, all. I've been building a new source connector using the low-code CDK but got a similar error to this one with my acceptance tests, even though this command works perfectly on my end:
    python main.py read --config secrets/config.json --catalog integration_tests/configured_catalog.json
    The linked error was apparently due to the /var/run/docker.sock symlink being missing, but I'm not sure how to resolve it. Any ideas?
  • Oleg Lipunov

    11/03/2022, 4:01 AM
    Hello fellow Airbyters! This is a Facebook question / request for information. I have a dev app that connects to a business app on Facebook. This dev app was built specifically for Airbyte and nothing else is using it. I am pulling data only from ads_insights. I already asked for and was approved by Facebook for Advanced Access; the app review was approved and the business was verified. I am nowhere near the application-level rate limit: in the FB dashboard I am almost always at 100% remaining (even when I am throttled). I am running the updated Airbyte version 0.40.18; data goes into GCP BQ every 2 hours, with GCS Staging as the loading method. Source settings: Start date: 2022-01-28T000000Z; End date: blank; Include Deleted: off; Fetch Thumbnail Images: off; Page Size of requests: 10; Insights Lookback Window: 7; Maximum size of Batched Requests: 50. Airbyte is located on a Compute Engine instance with 2 vCPUs, 8 GB of RAM, and 120 GB of disk space. Why am I still throttled? It takes over 1 hour and 20 minutes to transfer 80 MB of data, whereas my Postgres instance does 200 MB in less than 4 minutes. Any help, information, or feedback is very much appreciated.
  • Max Ferguson

    11/03/2022, 4:01 AM
    Hi All, Is there any plan to add the new Early Fraud Warnings endpoint (https://stripe.com/docs/api/radar/early_fraud_warnings) to the Stripe Source Connector (GA)? @John (Airbyte)
  • Phuc Dinh Minh

    11/03/2022, 5:59 AM
    Hi guys, I'm developing a custom tool, and I see the advanced_auth and authSpecification fields in the code of some sources. What do those configs do? I did try to read the Airbyte protocol codebase, but I still don't understand what they actually do.
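    Roughly: authSpecification is the older field and advanced_auth its successor; both tell the platform how to run an OAuth consent flow for a connector and where to inject the resulting secrets into the config. A hedged sketch of the shape of an advanced_auth block; the values are illustrative, not from a real connector:

    # Shape only; field values are illustrative.
    advanced_auth = {
        "auth_flow_type": "oauth2.0",
        # Apply OAuth only when the user picks this auth method:
        "predicate_key": ["credentials", "auth_type"],
        "predicate_value": "oauth2.0",
        "oauth_config_specification": {
            # Fields the platform writes into the config after consent:
            "complete_oauth_output_specification": {
                "type": "object",
                "properties": {"refresh_token": {"type": "string"}},
            },
        },
    }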
  • Siddhant Singh

    11/03/2022, 8:46 AM
    Hi. We are implementing a custom connector for a source that pulls data from APIs. The data we receive is pushed to Postgres as the destination, which creates a jsonb column and stores the raw JSON there. After storing the data we tried to implement basic normalization, and we are running into an issue. We have created this issue where you can read more about it: https://github.com/airbytehq/airbyte/issues/18904. The code we have implemented is here: https://github.com/DevDataPlatform/airbyte/blob/master/airbyte-integrations/connectors/source-surveycto/source_surveycto/source.py. Please suggest what we can do here; perhaps the documentation is suggesting something that is not clear to us. Document for basic normalization: https://docs.airbyte.com/understanding-airbyte/basic-normalization#destinations-that-support-basic-normalization