# ask-community-for-troubleshooting
  •

    Yann Phan Dinh

    03/24/2023, 9:24 AM
    Hey, I get this error with my custom connector built with the low-code CDK:
    Copy code
    java.util.concurrent.ExecutionException: java.lang.RuntimeException: No properties node in stream schema
    	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:396) ~[?:?]
    	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:2073) ~[?:?]
    	at io.airbyte.workers.general.DefaultReplicationWorker.replicate(DefaultReplicationWorker.java:272) ~[io.airbyte-airbyte-commons-worker-0.42.0.jar:?]
    	at io.airbyte.workers.general.DefaultReplicationWorker.run(DefaultReplicationWorker.java:190) ~[io.airbyte-airbyte-commons-worker-0.42.0.jar:?]
    	at io.airbyte.workers.general.DefaultReplicationWorker.run(DefaultReplicationWorker.java:94) ~[io.airbyte-airbyte-commons-worker-0.42.0.jar:?]
    	at io.airbyte.workers.temporal.TemporalAttemptExecution.lambda$getWorkerThread$6(TemporalAttemptExecution.java:202) ~[io.airbyte-airbyte-workers-0.42.0.jar:?]
    	at java.lang.Thread.run(Thread.java:1589) ~[?:?]
    Caused by: java.lang.RuntimeException: No properties node in stream schema
    	at io.airbyte.workers.general.DefaultReplicationWorker.populateStreamToAllFields(DefaultReplicationWorker.java:727) ~[io.airbyte-airbyte-commons-worker-0.42.0.jar:?]
    	at io.airbyte.workers.general.DefaultReplicationWorker.lambda$readFromSrcAndWriteToDstRunnable$7(DefaultReplicationWorker.java:391) ~[io.airbyte-airbyte-commons-worker-0.42.0.jar:?]
    	at java.util.concurrent.CompletableFuture$AsyncRun.run(CompletableFuture.java:1804) ~[?:?]
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144) ~[?:?]
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642) ~[?:?]
    	... 1 more
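For context on the error above: the worker raises "No properties node in stream schema" when a stream's JSON schema in the configured catalog has no top-level `properties` object. A minimal sketch of the check and a schema that passes it (the field names here are illustrative, not from the original connector):

```python
# Hypothetical minimal stream schema for a low-code CDK stream.
# A bare {"type": "object"} schema (or {}) is what triggers
# "No properties node in stream schema" in the replication worker.
schema = {
    "$schema": "http://json-schema.org/draft-07/schema#",
    "type": "object",
    "properties": {
        "id": {"type": "integer"},
        "updated_at": {"type": "string", "format": "date-time"},
    },
}

def has_properties_node(stream_schema: dict) -> bool:
    """Mimic the worker's requirement: the schema must carry a 'properties' object."""
    return isinstance(stream_schema.get("properties"), dict)

print(has_properties_node(schema))              # True
print(has_properties_node({"type": "object"}))  # False
```

In practice this usually means the connector's declared schema file (or dynamic schema loader) returned an empty or object-less schema for one of the streams.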
  •

    Yann Phan Dinh

    03/24/2023, 9:25 AM
    The data couldn't be pulled because of that. Any idea?
  •

    Aman Kesharwani

    03/24/2023, 9:46 AM
    Hi, I am trying to deploy Airbyte in a Kubernetes cluster using Helm, where the Airbyte DB will be provisioned outside of the Kubernetes cluster. Other than changing the Airbyte DB endpoints and credentials in the values.yaml file, do I need to make any other changes in the chart configuration?
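On the external-database question above: a commonly used pattern is to disable the chart's bundled Postgres and point the chart at the external instance. The keys below are illustrative only; names vary between Airbyte Helm chart versions, so verify against your chart's own values.yaml:

```yaml
# Illustrative overrides, not authoritative -- check your chart version's
# values.yaml for the exact key names.
postgresql:
  enabled: false                    # skip deploying the bundled Postgres
externalDatabase:
  host: airbyte-db.example.com      # hypothetical external endpoint
  port: 5432
  database: airbyte
  user: airbyte
  existingSecret: airbyte-db-secret # hypothetical secret holding the password
```

Beyond the endpoint and credentials, usually nothing else is needed in the chart itself: Airbyte creates its tables on first boot, but the database and user must already exist with appropriate permissions.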
  •

    Archit Singh

    03/24/2023, 10:39 AM
    Hey guys, I wanted your help with Airbyte replication. I am trying to replicate a DB from MSSQL to Postgres and it keeps failing after 3 attempts. I checked the log; all I see is this error:
    Copy code
    2023-03-24 10:29:08 INFO i.a.v.j.JsonSchemaValidator(test):130 - JSON schema validation failed.
    errors: $: does not match the regex pattern ^(?:[A-Za-z0-9+/]{4})*(?:[A-Za-z0-9+/]{2}==|[A-Za-z0-9+/]{3}=)?$
    What does this mean? Could anyone help me?
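For what it's worth, the pattern in that log line is the standard Base64 validation regex, so the validator is rejecting a value that was expected to be Base64-encoded (often a binary column mapped to a base64 string type). A quick way to check which values fail (a sketch, not Airbyte code):

```python
import re

# The regex copied verbatim from the log message: it accepts well-formed Base64.
BASE64 = re.compile(r"^(?:[A-Za-z0-9+/]{4})*(?:[A-Za-z0-9+/]{2}==|[A-Za-z0-9+/]{3}=)?$")

print(bool(BASE64.match("SGVsbG8=")))     # True: well-formed Base64
print(bool(BASE64.match("hello world")))  # False: space is not in the Base64 alphabet
```

So the next step is usually to find which column's values the source is emitting un-encoded (or with a different encoding) for the failing stream.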
  •

    Idris Osman

    03/24/2023, 12:41 PM
    Hi, has anyone come across a "Failed to fetch schema" error for a data source with a large number of tables (MySQL to GCS)? Refer to the link for context.
  •

    Pierre CORBEL

    03/24/2023, 12:52 PM
    String and date-time format
    Hello there 👋, it seems that Airbyte doesn't recognize strings with a date-time format, and it causes problems in dbt normalization 💥 In my schema:
    Copy code
    "_sdc_sales_updated_at": {"type": "string", "format": "date-time"},
    On the Airbyte UI it seems that Unknown is considered a variant for dbt afterward. Has anyone had the same problem? 🧐 Am I doing something wrong?
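One thing worth checking, hedged since it depends on the Airbyte and normalization versions in play: newer protocol versions disambiguate timestamps with an explicit `airbyte_type` alongside `format`, e.g.:

```json
{
  "_sdc_sales_updated_at": {
    "type": "string",
    "format": "date-time",
    "airbyte_type": "timestamp_with_timezone"
  }
}
```

If the UI still shows Unknown, comparing the catalog Airbyte actually discovered against the schema you declared may show where the format annotation is being dropped.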
  •

    Johannes Müller

    03/24/2023, 1:10 PM
    Hi! Here is a small bugfix for the octavia-cli install script: https://github.com/airbytehq/airbyte/pull/24459 Apparently the docker image mentioned in the script does not exist, which makes the script fail. I updated it to the latest version 🙂
  •

    Thiago Villani

    03/24/2023, 2:23 PM
    Hello, I activated CDC in MSSQL and it's working correctly. In Airbyte, which is the best option to choose: Incremental Sync - Append or Incremental Sync - Deduped History? And when I execute the second sync, which does not pick up any data, Airbyte reads the whole table again, but the sync takes twice as long as the first full sync. How can I improve this time?
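On the Append vs Deduped History part of the question above: Append keeps every emitted record, while Deduped History additionally materializes a final table with one row per primary key, keeping the record with the latest cursor value. A rough sketch of the dedup step (illustrative only, not Airbyte's actual SQL):

```python
# Illustrative dedup: one row per primary key, latest cursor value wins.
# Airbyte's Deduped History mode does this via generated dbt/SQL models,
# not Python -- this only shows the semantics.
def dedup(records, pk="id", cursor="updated_at"):
    latest = {}
    for r in records:
        k = r[pk]
        if k not in latest or r[cursor] > latest[k][cursor]:
            latest[k] = r
    return list(latest.values())
```

For CDC sources, Deduped History is the usual choice since updates and deletes for the same key need to collapse into one current row.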
  •

    Leo Allgaier

    03/24/2023, 3:44 PM
    Hey guys, maybe some of you could help me. I'm trying to set up a source for Shopify right now. When I click on "Set up source" it shows me this error. Does anyone know why it appears?
  •

    Robert Put

    03/24/2023, 4:05 PM
    I see column selection in Airbyte Cloud for a MySQL-to-Snowflake connection, but in a different OSS version of Airbyte. I still don't see any release notes on this feature; would it be expected with the next OSS platform version?
  •

    Yuri Maia Santana

    03/24/2023, 4:24 PM
    Hey guys! I need help regarding a deployment I'm doing with the open source version. I'm new to the tool and I'm still getting to know its full power. I did the Airbyte setup on a VM and created the settings to run an ELT, but now I need to switch to another VM and I don't want to redo all the source and destination settings. Is there any way to do this? Or use git for this change? Is there a folder in Airbyte where these settings are saved?
  •

    Gabriel Levine

    03/24/2023, 4:58 PM
    Does anyone know why a connector update wouldn’t be “sticking”? I’ve updated Postgres and Salesforce sources to the latest version a few times. The changes are saved and the current version is updated, but when I go back and look later they’ve reverted. If relevant, these are both major version updates to active connectors. I’m running Airbyte 0.40.25.
  •

    Adam Roderick

    03/24/2023, 8:39 PM
    I created a PR for a connector about 3 weeks ago. I can't get anyone to engage to help me understand what I am missing about the approval process. Can anyone help?
  •

    Matheus Barbosa

    03/24/2023, 8:50 PM
    Can someone please help me with this topic? I’m trying to get some help and I’m really struggling with that but I have no idea how to solve it! https://airbytehq.slack.com/archives/C027KKE4BCZ/p1679603839492009
  •

    Daniel Zuluaga

    03/25/2023, 5:04 AM
    Could someone please help me figure out why I can't set up a Google Ads connection? (Airbyte running on an EC2 instance in my own deployment)
  •

    Omprakash Kumar

    03/25/2023, 8:39 AM
    Hi, the Airbyte server is giving a CORS error when I try to access the API from a different React application.
  •

    Johannes Müller

    03/25/2023, 10:07 AM
    Could you point me to the MySQL connector code logic for the incremental sync mode? The code is a bit hard to navigate -.- I would expect it to be in MySQLSource.java or MySQLSourceOperations.java, but I can't locate it. The incremental mode does not work at all for me (I either get duplicates or only parts of the source table :D), so I'd like to have a look at the code to see if it's a simple fix; otherwise I can add a bug report.
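For what it's worth, for standard (non-CDC) incremental the logic mostly lives in the shared relational-db source code rather than in the MySQL classes themselves, as far as I can tell. Conceptually it builds a cursor-filtered query like this sketch (illustrative Python, not the actual Java):

```python
# Sketch of cursor-based incremental: the previous sync's max cursor value is
# saved as state, and the next sync reads only rows past that value. Rows that
# share the exact saved cursor value may be re-emitted or skipped depending on
# whether the comparison is > or >= -- a common source of duplicates/gaps.
def build_incremental_query(table: str, cursor_field: str, last_cursor):
    if last_cursor is None:
        # first sync: full read
        return f"SELECT * FROM {table} ORDER BY {cursor_field}", ()
    return (
        f"SELECT * FROM {table} WHERE {cursor_field} > %s ORDER BY {cursor_field}",
        (last_cursor,),
    )
```

If your cursor column is coarse-grained (e.g. a DATE) or not monotonically increasing, both duplicate and partial reads like you describe can fall out of this scheme.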
  •

    Johannes Müller

    03/25/2023, 10:11 AM
    @Vitalii Maltsev [GL] Maybe you could help?
  •

    Rahul Sahay

    03/25/2023, 1:32 PM
    Hi @channel, I have a use case where the source system happens to be SAP ECC. I would like to understand what capabilities Airbyte has. Thanks
  •

    Leo Allgaier

    03/25/2023, 3:37 PM
    Does anyone have an idea? I would be grateful for an answer! Thanks in advance!
  •

    Anton Marini

    03/25/2023, 7:35 PM
    Hi friends. Question about the Postgres syncing mechanisms. I'm reading through: https://docs.airbyte.com/integrations/sources/postgres/#configuring-postgres-connector-with-change-data-capture-cdc
    The connector waits for the default initial wait time of 5 minutes (300 seconds). Setting the parameter to a longer duration will result in slower syncs, while setting it to a shorter duration may cause the connector to not have enough time to create the initial snapshot or read through the change logs. The valid range is 120 seconds to 1200 seconds.
    Does this mean that if I am syncing data via CDC, it isn't using any pub/sub event notifications, but a sort of polling mechanism that runs at minimum every 2 minutes? Sorry if I'm missing something obvious; I'm new to CDC and trying to wrap my head around the moving parts.
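My understanding of the mechanism in the quoted docs, sketched below and hedged since this is not Airbyte's actual code: each CDC sync embeds a Debezium engine for the duration of that run. The engine reads the database's change log (a replication slot for Postgres), waits up to the configured initial time for the first records, drains what is available, then shuts down. Between syncs nothing is listening, so it is a batch read over a durable log rather than a standing pub/sub subscription:

```python
import time

# Conceptual sketch only: one CDC sync run draining a change log.
def run_cdc_sync(poll_change_log, first_record_wait_s=300.0, idle_wait_s=1.0):
    records = []
    deadline = time.monotonic() + first_record_wait_s
    while time.monotonic() < deadline:
        batch = poll_change_log()
        if batch:
            records.extend(batch)
            # once data flows, only wait a short idle window for more
            deadline = time.monotonic() + idle_wait_s
    return records
```

If that reading is right, the 120-1200 second range in the docs bounds how long a single sync waits on the change log, not a background polling interval between syncs.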
  •

    Brian Castelli

    03/26/2023, 1:31 AM
    One of my Airbyte clusters was installed using Kustomize. I want to upgrade to the latest Airbyte. I can’t find instructions for doing such an upgrade using Helm. What is the procedure for upgrading a Kustomize installation? (Yeah, I saw the deprecation notice, but the linked commit doesn’t seem to have anything to do with what the page is talking about. See below.)
  •

    Johannes Müller

    03/26/2023, 9:00 AM
    Hi, why do the auto-generated models contain a
    hashid
    column with an md5 based on the row values? What is it used for?
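As far as I understand it (hedged), the hashid is a surrogate key: normalization's dbt models derive it by hashing the row's column values, so raw and normalized records can be joined and deduplicated without relying on a natural key. Roughly like this sketch (illustrative; the generated dbt models have their own concatenation and null-handling rules, e.g. via a surrogate-key macro):

```python
import hashlib

# Illustrative surrogate key: an md5 over the stringified row values.
def row_hashid(values):
    joined = "|".join("" if v is None else str(v) for v in values)
    return hashlib.md5(joined.encode("utf-8")).hexdigest()
```

The practical upshot is that the same row values always map to the same hashid across runs, which is what makes incremental normalization and dedup joins stable.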
  •

    Pankaj

    03/26/2023, 11:09 AM
    Getting this error when I increase the load. No spike in CPU or memory. Source: Postgres, Destination: S3
  •

    Pankaj

    03/26/2023, 11:09 AM
    details.,metadata=io.airbyte.config.Metadata@27fd8ea1[additionalProperties={attemptNumber=0, jobId=1819, from_trace_message=true, connector_command=read}],stacktrace=
    java.lang.RuntimeException: java.lang.RuntimeException: java.lang.RuntimeException: java.lang.InterruptedException
    	at io.airbyte.integrations.debezium.internals.DebeziumRecordIterator.requestClose(DebeziumRecordIterator.java:195)
    	at io.airbyte.integrations.debezium.internals.DebeziumRecordIterator.computeNext(DebeziumRecordIterator.java:141)
    	at io.airbyte.integrations.debezium.internals.DebeziumRecordIterator.computeNext(DebeziumRecordIterator.java:34)
    	at com.google.common.collect.AbstractIterator.tryToComputeNext(AbstractIterator.java:146)
    	at com.google.common.collect.AbstractIterator.hasNext(AbstractIterator.java:141)
    	at com.google.common.collect.TransformedIterator.hasNext(TransformedIterator.java:46)
    	at io.airbyte.commons.util.DefaultAutoCloseableIterator.computeNext(DefaultAutoCloseableIterator.java:38)
    	at com.google.common.collect.AbstractIterator.tryToComputeNext(AbstractIterator.java:146)
    	at com.google.common.collect.AbstractIterator.hasNext(AbstractIterator.java:141)
    	at io.airbyte.commons.util.CompositeIterator.computeNext(CompositeIterator.java:63)
    	at com.google.common.collect.AbstractIterator.tryToComputeNext(AbstractIterator.java:146)
    	at com.google.common.collect.AbstractIterator.hasNext(AbstractIterator.java:141)
    	at io.airbyte.commons.util.CompositeIterator.computeNext(CompositeIterator.java:63)
    	at com.google.common.collect.AbstractIterator.tryToComputeNext(AbstractIterator.java:146)
    	at com.google.common.collect.AbstractIterator.hasNext(AbstractIterator.java:141)
    	at io.airbyte.commons.util.DefaultAutoCloseableIterator.computeNext(DefaultAutoCloseableIterator.java:38)
    	at com.google.common.collect.AbstractIterator.tryToComputeNext(AbstractIterator.java:146)
    	at com.google.common.collect.AbstractIterator.hasNext(AbstractIterator.java:141)
    	at io.airbyte.commons.util.DefaultAutoCloseableIterator.computeNext(DefaultAutoCloseableIterator.java:38)
    	at com.google.common.collect.AbstractIterator.tryToComputeNext(AbstractIterator.java:146)
    	at com.google.common.collect.AbstractIterator.hasNext(AbstractIterator.java:141)
    	at java.base/java.util.Iterator.forEachRemaining(Iterator.java:132)
    	at io.airbyte.integrations.base.IntegrationRunner.lambda$produceMessages$0(IntegrationRunner.java:187)
    	at io.airbyte.integrations.base.IntegrationRunner.watchForOrphanThreads(IntegrationRunner.java:237)
    	at io.airbyte.integrations.base.IntegrationRunner.produceMessages(IntegrationRunner.java:186)
    	at io.airbyte.integrations.base.IntegrationRunner.runInternal(IntegrationRunner.java:139)
    	at io.airbyte.integrations.base.IntegrationRunner.run(IntegrationRunner.java:98)
    	at io.airbyte.integrations.base.adaptive.AdaptiveSourceRunner$Runner.run(AdaptiveSourceRunner.java:86)
    	at io.airbyte.integrations.source.postgres.PostgresSourceRunner.main(PostgresSourceRunner.java:15)
    Caused by: java.lang.RuntimeException: java.lang.RuntimeException: java.lang.InterruptedException
    	at io.airbyte.integrations.debezium.internals.DebeziumRecordPublisher.close(DebeziumRecordPublisher.java:105)
    	at io.airbyte.commons.concurrency.VoidCallable.call(VoidCallable.java:15)
    	at io.airbyte.integrations.debezium.internals.DebeziumRecordIterator.requestClose(DebeziumRecordIterator.java:192)
    	... 28 more
    Caused by: java.lang.RuntimeException: java.lang.InterruptedException
    	at io.airbyte.integrations.debezium.internals.DebeziumRecordPublisher.lambda$start$0(DebeziumRecordPublisher.java:68)
    	at io.debezium.embedded.ConvertingEngineBuilder.lambda$notifying$0(ConvertingEngineBuilder.java:72)
    	at io.debezium.embedded.EmbeddedEngine$1.handleBatch(EmbeddedEngine.java:473)
    	at io.debezium.embedded.EmbeddedEngine.run(EmbeddedEngine.java:822)
    	at io.debezium.embedded.ConvertingEngineBuilder$2.run(ConvertingEngineBuilder.java:192)
    	at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136)
    	at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635)
    	at java.base/java.lang.Thread.run(Thread.java:833)
    Caused by: java.lang.InterruptedException
    	at java.base/java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.await(AbstractQueuedSynchronizer.java:1638)
    	at java.base/java.util.concurrent.LinkedBlockingQueue.put(LinkedBlockingQueue.java:343)
    	at io.airbyte.integrations.debezium.internals.DebeziumRecordPublisher.lambda$start$0(DebeziumRecordPublisher.java:65)
    	... 7 more
    ,retryable=<null>,timestamp=1679749998692],
    io.airbyte.config.FailureReason@3f4e8d99[failureOrigin=source,failureType=<null>,internalMessage=io.airbyte.workers.internal.exception.SourceException: Source cannot be stopped!,externalMessage=Something went wrong within the source connector,metadata=io.airbyte.config.Metadata@e4148fb[additionalProperties={attemptNumber=0, jobId=1819, connector_command=read}],stacktrace=
    java.util.concurrent.CompletionException: io.airbyte.workers.internal.exception.SourceException: Source cannot be stopped!
    	at java.base/java.util.concurrent.CompletableFuture.encodeThrowable(CompletableFuture.java:315)
    	at java.base/java.util.concurrent.CompletableFuture.completeThrowable(CompletableFuture.java:320)
    	at java.base/java.util.concurrent.CompletableFuture$AsyncRun.run(CompletableFuture.java:1807)
    	at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144)
    	at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642)
    	at java.base/java.lang.Thread.run(Thread.java:1589)
    Caused by: io.airbyte.workers.internal.exception.SourceException: Source cannot be stopped!
    	at io.airbyte.workers.general.DefaultReplicationWorker.lambda$readFromSrcAndWriteToDstRunnable$7(DefaultReplicationWorker.java:392)
    	at java.base/java.util.concurrent.CompletableFuture$AsyncRun.run(CompletableFuture.java:1804)
    	... 3 more
    Caused by: io.airbyte.workers.exception.WorkerException: Source process exit with code 1. This warning is normal if the job was cancelled.
    	at io.airbyte.workers.internal.DefaultAirbyteSource.close(DefaultAirbyteSource.java:162)
    	at io.airbyte.workers.general.DefaultReplicationWorker.lambda$readFromSrcAndWriteToDstRunnable$7(DefaultReplicationWorker.java:390)
    	... 4 more
    ,retryable=<null>,timestamp=1679749998833]]]
  •

    Renat Zubayrov

    03/26/2023, 1:39 PM
    Hey channel, we have Airbyte 0.41.0 deployed on #k8s and (probably due to a failed attempt to configure GCP log storage) the sync processes are now stuck at
    i.a.w.p.KubeProcessFactory(create):98 - Attempting to start pod
    Any ideas what we could check to figure out what went wrong, or which log files to check for more details?
  •

    Zawar Khan

    03/27/2023, 10:23 AM
    Hi team, can somebody reply to my issue?
  •

    Shreepad Khandve

    03/27/2023, 1:53 PM
    Hi all, can somebody tell me the stop condition for the response below, for pagination in a low-code connector? I am getting link > next > url of the next page.
    Copy code
    pagination_strategy:
      type: "CursorPagination"
      cursor_value: "{{ response.links.next }}"
      stop_condition: ???
    response: {................... "tag_ids": null, "user": {"company_id": "5d48461d2a86930001c87f23", "id": "5d5e6b0d8f7035000110d076", "role": "tester"}, "question": {"id": "5d6e834775b01d0001b8b933", "company_id": "5d48461d2a86930001c87f23", "question_type": "freeform", "prompt": "Tell us a bit more about why:", "options": {"size": "multi"}}}]}, "links": {"next": "https://iteratehq.com/api/v1/surveys/5d6e82a775b01d0001b7/responses?access_token=##############&page[cursor]=5d6f14e2b27070798fab123"}}, "emitted_at": 1679925022875}}
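One possible shape for the missing field above, hedged since the exact Jinja expression supported depends on your low-code CDK version: stop paginating when the response no longer carries a next link, e.g.:

```yaml
pagination_strategy:
  type: "CursorPagination"
  cursor_value: "{{ response.links.next }}"
  # Illustrative stop condition: end when no "next" link is present.
  # Check your CDK version's docs for the exact expression syntax.
  stop_condition: "{{ not response.links.next }}"
```

The idea is that the last page of the API response omits (or nulls out) `links.next`, so the expression evaluates truthy and pagination halts.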
  •

    Frans Krojegürd

    03/27/2023, 2:15 PM
    Hi, I ran into issues when integrating with the Zuora source. Large tables weren't working (> 10 million rows), so I fixed the query generation (https://github.com/airbytehq/airbyte/pull/24460) to get around Zuora's bug on that. The calls aren't failing any more, but now I have new issues. Now: • queries are seemingly run in duplicate, in sequence, • and are followed by LOTS of
    DESCRIBE <table>
    calls. I haven't checked if it ever finishes, but for a large table it ran at least 100 calls before I stopped it. These errors are not present in the original Zuora connector. The only difference that's not part of the linked PR is this issue I had with OAuth: https://github.com/frans-k/airbyte/commit/a1dec028dd243538cf8ad88bf941c1cffc153455 I'm unclear why I had issues with OAuth, but neither my PR nor my OAuth workaround quite explains why the Zuora connector started behaving the way it does now. Does this sound familiar to anyone? To me it feels very nonsensical, and I'm quite unsure where to look.
  •

    Liam Coley

    03/27/2023, 3:00 PM
    Hi everyone, just wondering if anyone has had any experience with this issue: we run several connections from S3 to Redshift. After upgrading to 0.42.0, these connections have begun repeatedly failing with the error
    Failed to start sync: non-json response
    . No job is even created. Watching the logs, this appears to be the only relevant error that is raised:
    Copy code
    airbyte-webapp                    | 2023/03/26 02:18:36 [error] 36#36: *3416 upstream prematurely closed connection while reading response header from upstream, client: 123.0.0.0, server: localhost, request: "POST /api/v1/connections/sync HTTP/1.0", upstream: "<http://123.0.0.0:8001/api/v1/connections/sync>", host: "localhost", referrer: "<http://localhost:8000/workspaces/b191bf96-69cf-4373-b09e-698e5f4f7ed0/connections/e15cbd03-da9e-43f4-8c78-ad90ffa43cb6/status>"
    I’m running this on an Amazon EC2 t3.large instance with 100GB of attached storage. As mentioned, it only started occurring after upgrading to 0.42.0. From looking through StackOverflow, it may be an nginx issue; unsure if there’s been a change there that may affect this. Any help is hugely appreciated. Thank you!