# ask-community-for-troubleshooting

    Balaji Seetharaman

    10/31/2022, 5:37 PM
    Hi Team, I'm working on the following issue https://github.com/airbytehq/connector-contest/issues/199 and hit the failure below while running the acceptance tests. Can someone please help me get it passing?
    FAILED ../../bases/source-acceptance-test/source_acceptance_test/tests/test_full_refresh.py::TestFullRefresh::test_sequential_reads[inputs0] - Failed: wifi_status: the two sequential reads should produce either equal set of records or one of th...
    Results (65.12s):
    23 passed
    1 failed
    - airbyte-integrations/bases/source-acceptance-test/source_acceptance_test/tests/test_full_refresh.py:39 TestFullRefresh.test_sequential_reads[inputs0]
    2 skipped
    (.venv) bseetharaman@EXT-C02D27NHMD6M source-influxdb %

    Leo G

    10/31/2022, 5:47 PM
    Can I run Airbyte standalone, without Docker?

    Robert Put

    10/31/2022, 6:25 PM
    https://snyk.io/blog/new-openssl-critical-vulnerability/ Will any patching be needed for the Airbyte Docker containers tomorrow to address these issues? I'm not sure what base images are being used.

    Lucas Gonthier

    10/31/2022, 6:26 PM
    Hello all, I'm using the Airbyte API to create connections and I would like to use basic normalization for a BigQuery destination. However, I can't find which operation_id I need to supply to perform this normalization. From what I can see, I should send something like this in the request:
    "operations": [{"workspace_id": "f9d8801a-7ac7-40fc-8e9a-1dd497ca4d82", "operation_id": "007220dd-022f-461d-aa70-53a61e7f90fd", "name": "Normalization", "operator_configuration": {"operator_type": "normalization", "normalization": {"option": "basic"}}}]

    Tarak dba

    10/31/2022, 6:54 PM
    Can we have a row filter where we can filter on a column value, so that only a few matching rows are replicated from source to destination? E.g. WHERE status = 'A': we want to replicate only the rows whose status is 'A', and the remaining rows should not be replicated. The source Status column has the values A, I, C, but the destination should only have status 'A'.
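As far as I know Airbyte does not expose per-row filtering in the connection settings; the usual pattern is to replicate everything and filter downstream, e.g. in a dbt model or a SQL view with WHERE status = 'A'. The intended behavior, sketched in Python with made-up sample rows:

```python
# Sketch of the desired behavior: keep only rows whose status is 'A'.
# In practice this filter would live downstream, e.g.:
#   CREATE VIEW active AS SELECT * FROM src WHERE status = 'A';
rows = [
    {"id": 1, "status": "A"},
    {"id": 2, "status": "I"},
    {"id": 3, "status": "C"},
    {"id": 4, "status": "A"},
]

def filter_rows(records, column, value):
    """Return only the records where `column` equals `value`."""
    return [r for r in records if r.get(column) == value]

replicated = filter_rows(rows, "status", "A")  # only ids 1 and 4 survive
```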

    Kevin Phan

    10/31/2022, 7:04 PM
    hey folks, I have attached my server logs. I am creating an S3 destination with a role that has read and write permissions to the S3 bucket of interest. I have everything set up following the architecture I attached, where the IAM role is connected to an OIDC provider. The Airbyte worker pod uses the Airbyte Admin Service Account, which is linked to the role via the OIDC provider. Everything should technically be working, but I saw this in the server logs:
    2022-10-03 15:13:33 WARN i.a.s.s.AirbyteGithubStore(getLatestSources):69 - Unable to retrieve latest Source list from Github. Using the list bundled with Airbyte. This warning is expected if this Airbyte cluster does not have internet access.
    java.net.http.HttpTimeoutException: request timed out
    	at jdk.internal.net.http.HttpClientImpl.send(HttpClientImpl.java:844) ~[java.net.http:?]
    	at jdk.internal.net.http.HttpClientFacade.send(HttpClientFacade.java:123) ~[java.net.http:?]
    	at io.airbyte.server.services.AirbyteGithubStore.getFile(AirbyteGithubStore.java:83) ~[io.airbyte-airbyte-server-0.40.0-alpha.jar:?]
    	at io.airbyte.server.services.AirbyteGithubStore.getLatestSources(AirbyteGithubStore.java:67) ~[io.airbyte-airbyte-server-0.40.0-alpha.jar:?]
    	at io.airbyte.server.handlers.SourceDefinitionsHandler.getLatestSources(SourceDefinitionsHandler.java:136) ~[io.airbyte-airbyte-server-0.40.0-alpha.jar:?]
    	at io.airbyte.server.handlers.SourceDefinitionsHandler.listLatestSourceDefinitions(SourceDefinitionsHandler.java:131) ~[io.airbyte-airbyte-server-0.40.0-alpha.jar:?]
    	at io.airbyte.server.apis.ConfigurationApi.execute(ConfigurationApi.java:873) ~[io.airbyte-airbyte-server-0.40.0-alpha.jar:?]
    	at io.airbyte.server.apis.ConfigurationApi.listLatestSourceDefinitions(ConfigurationApi.java:324) ~[io.airbyte-airbyte-server-0.40.0-alpha.jar:?]
    	at jdk.internal.reflect.DirectMethodHandleAccessor.invoke(DirectMethodHandleAccessor.java:104) ~[?:?]
    	at java.lang.reflect.Method.invoke(Method.java:578) ~[?:?]
    	at org.glassfish.jersey.server.model.internal.ResourceMethodInvocationHandlerFactory.lambda$static$0(ResourceMethodInvocationHandlerFactory.java:52) ~[jersey-server-2.31.jar:?]
    	at org.glassfish.jersey.server.model.internal.AbstractJavaResourceMethodDispatcher$1.run(AbstractJavaResourceMethodDispatcher.java:124) ~[jersey-server-2.31.jar:?]
    	at org.glassfish.jersey.server.model.internal.AbstractJavaResourceMethodDispatcher.invoke(AbstractJavaResourceMethodDispatcher.java:167) ~[jersey-server-2.31.jar:?]
    	at org.glassfish.jersey.server.model.internal.JavaResourceMethodDispatcherProvider$TypeOutInvoker.doDispatch(JavaResourceMethodDispatcherProvider.java:219) ~[jersey-server-2.31.jar:?]
    	at org.glassfish.jersey.server.model.internal.AbstractJavaResourceMethodDispatcher.dispatch(AbstractJavaResourceMethodDispatcher.java:79) ~[jersey-server-2.31.jar:?]
    	at org.glassfish.jersey.server.model.ResourceMethodInvoker.invoke(ResourceMethodInvoker.java:469) ~[jersey-server-2.31.jar:?]
    	at org.glassfish.jersey.server.model.ResourceMethodInvoker.apply(ResourceMethodInvoker.java:391) ~[jersey-server-2.31.jar:?]
    	at org.glassfish.jersey.server.model.ResourceMethodInvoker.apply(ResourceMethodInvoker.java:80) ~[jersey-server-2.31.jar:?]
    	at org.glassfish.jersey.server.ServerRuntime$1.run(ServerRuntime.java:253) ~[jersey-server-2.31.jar:?]
    	at org.glassfish.jersey.internal.Errors$1.call(Errors.java:248) ~[jersey-common-2.31.jar:?]
    	at org.glassfish.jersey.internal.Errors$1.call(Errors.java:244) ~[jersey-common-2.31.jar:?]
    	at org.glassfish.jersey.internal.Errors.process(Errors.java:292) ~[jersey-common-2.31.jar:?]
    	at org.glassfish.jersey.internal.Errors.process(Errors.java:274) ~[jersey-common-2.31.jar:?]
    	at org.glassfish.jersey.internal.Errors.process(Errors.java:244) ~[jersey-common-2.31.jar:?]
    	at org.glassfish.jersey.process.internal.RequestScope.runInScope(RequestScope.java:265) ~[jersey-common-2.31.jar:?]
    	at org.glassfish.jersey.server.ServerRuntime.process(ServerRuntime.java:232) ~[jersey-server-2.31.jar:?]
    	at org.glassfish.jersey.server.ApplicationHandler.handle(ApplicationHandler.java:680) ~[jersey-server-2.31.jar:?]
    	at org.glassfish.jersey.servlet.WebComponent.serviceImpl(WebComponent.java:394) ~[jersey-container-servlet-core-2.31.jar:?]
    	at org.glassfish.jersey.servlet.WebComponent.service(WebComponent.java:346) ~[jersey-container-servlet-core-2.31.jar:?]
    	at org.glassfish.jersey.servlet.ServletContainer.service(ServletContainer.java:366) ~[jersey-container-servlet-core-2.31.jar:?]
    	at org.glassfish.jersey.servlet.ServletContainer.service(ServletContainer.java:319) ~[jersey-container-servlet-core-2.31.jar:?]
    	at org.glassfish.jersey.servlet.ServletContainer.service(ServletContainer.java:205) ~[jersey-container-servlet-core-2.31.jar:?]
    	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:763) ~[jetty-servlet-9.4.31.v20200723.jar:9.4.31.v20200723]
    	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:569) ~[jetty-servlet-9.4.31.v20200723.jar:9.4.31.v20200723]
    	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) ~[jetty-server-9.4.31.v20200723.jar:9.4.31.v20200723]
    	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1377) ~[jetty-server-9.4.31.v20200723.jar:9.4.31.v20200723]
    	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) ~[jetty-server-9.4.31.v20200723.jar:9.4.31.v20200723]
    	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:507) ~[jetty-servlet-9.4.31.v20200723.jar:9.4.31.v20200723]
    	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) ~[jetty-server-9.4.31.v20200723.jar:9.4.31.v20200723]
    	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1292) ~[jetty-server-9.4.31.v20200723.jar:9.4.31.v20200723]
    	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) ~[jetty-server-9.4.31.v20200723.jar:9.4.31.v20200723]
    	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) ~[jetty-server-9.4.31.v20200723.jar:9.4.31.v20200723]
    	at org.eclipse.jetty.server.Server.handle(Server.java:501) ~[jetty-server-9.4.31.v20200723.jar:9.4.31.v20200723]
    	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:383) ~[jetty-server-9.4.31.v20200723.jar:9.4.31.v20200723]
    	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:556) [jetty-server-9.4.31.v20200723.jar:9.4.31.v20200723]
    	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:375) [jetty-server-9.4.31.v20200723.jar:9.4.31.v20200723]
    	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:273) [jetty-server-9.4.31.v20200723.jar:9.4.31.v20200723]
    	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) [jetty-io-9.4.31.v20200723.jar:9.4.31.v20200723]
    	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) [jetty-io-9.4.31.v20200723.jar:9.4.31.v20200723]
    	at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) [jetty-io-9.4.31.v20200723.jar:9.4.31.v20200723]
    	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:336) [jetty-util-9.4.31.v20200723.jar:9.4.31.v20200723]
    	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:313) [jetty-util-9.4.31.v20200723.jar:9.4.31.v20200723]
    	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:171) [jetty-util-9.4.31.v20200723.jar:9.4.31.v20200723]
    	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:129) [jetty-util-9.4.31.v20200723.jar:9.4.31.v20200723]
    	at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:375) [jetty-util-9.4.31.v20200723.jar:9.4.31.v20200723]
    	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:806) [jetty-util-9.4.31.v20200723.jar:9.4.31.v20200723]
    	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:938) [jetty-util-9.4.31.v20200723.jar:9.4.31.v20200723]
    	at java.lang.Thread.run(Thread.java:1589) [?:?]
    I'm not entirely sure what this means. Could this be contributing to the destination not working?
    server-logs (3).txt

    Manish Tomar

    10/31/2022, 7:08 PM
    Help! Airbyte spins up the Snowflake warehouse even when no rows are emitted. How can we avoid or minimise warehouse cost when no records are being inserted or updated?

    Tarak dba

    10/31/2022, 7:09 PM
    Does Airbyte replicate based on RRN or on primary key? Will it support the AS400 DB2 RRN key?

    Tarak dba

    10/31/2022, 7:41 PM
    Does Airbyte support MariaDB as a source? If so, which connector should I use? Can I use the MySQL connector?

    Jesse

    10/31/2022, 8:02 PM
    Hi everyone, I have been running into issues doing a transfer from PostgreSQL to MySQL. The stream runs fine for a certain period of time (anywhere from 15 minutes to 2 hours) and then ends with an error: Connection org.postgresql.jdbc.PgConnection@69eb86b4 marked as broken because of SQLSTATE(08006), ErrorCode(0) Stack Trace: org.postgresql.util.PSQLException: An I/O error occurred while sending to the backend. I have attached my logs here. I am not sure what is causing this and am not finding anything in web searches. Was wondering what could be causing this.
    385df1f6_cd71_42aa_971b_3e5d87122eb0_logs_21_txt.txt
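SQLSTATE 08006 usually means the server or something on the network path dropped the connection mid-sync. One thing worth trying (a guess, not a confirmed fix) is passing keep-alive settings through the Postgres source's "JDBC URL Params" field; tcpKeepAlive and socketTimeout are documented PostgreSQL JDBC driver parameters. A sketch of building that parameter string:

```python
# Candidate settings for the Postgres source's "JDBC URL Params" field.
# tcpKeepAlive/socketTimeout are real PostgreSQL JDBC driver parameters;
# whether they resolve this particular failure is an assumption.
params = {
    "tcpKeepAlive": "true",  # have the OS probe idle TCP connections
    "socketTimeout": "0",    # 0 = no client-side read timeout
}

jdbc_url_params = "&".join(f"{k}={v}" for k, v in params.items())
```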

    Tarak dba

    10/31/2022, 8:03 PM
    Can we have a row filter where we can filter on a column value, so that only a few matching rows are replicated from source to destination?

    Dvir Katz

    10/31/2022, 8:09 PM
    Hi Team, is there a way to limit the number of records emitted by a source? Think of it as a kind of data preview, where I'd like to see a sample of X records.
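I'm not aware of a built-in record cap on most sources, but the preview idea is easy to sketch client-side: wrap whatever iterator yields records and stop after X. A minimal Python sketch (record_stream is a hypothetical stand-in for the source):

```python
from itertools import islice

def record_stream():
    """Stand-in for a source that emits records indefinitely."""
    n = 0
    while True:
        yield {"record": n}
        n += 1

def preview(stream, limit):
    """Take at most `limit` records from the stream, then stop."""
    return list(islice(stream, limit))

sample = preview(record_stream(), 5)  # a 5-record data preview
```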

    Jhon Edison Bambague Calderon

    10/31/2022, 8:15 PM
    Hi team, does anyone know the column limit of the S3 source connector for a CSV file? I have a CSV with more than 400 columns and it generates the following error

    Jing Xu

    10/31/2022, 8:45 PM
    Hi team, since upgrading Airbyte from v0.40.14 to v0.40.17, we've started to get an error: 413 Request Entity Too Large when refreshing the source schema to load a new stream. I read that the same issue occurred in an older Airbyte version and a new release solved it. Will there be a fix for this error in a future release?

    Will Curatolo

    10/31/2022, 9:06 PM
    Hey everyone 👋 Just starting to test out Airbyte locally, following the QuickStart instructions, but I just end up with this looping repeatedly:
    airbyte-temporal    | Waiting for PostgreSQL to startup.
    airbyte-temporal    | nc: bad address 'db'
    anyone else run into this?

    Gary K

    11/01/2022, 12:46 AM
    Hi everyone. I’m trying to load data from Azure Blob Storage (put there by Airbyte) into a dedicated SQL pool landing zone (i.e., with the _airbyte_data unflattened), and wondering if anyone has done or tried this before? I can’t seem to get Azure to read the JSON blob storage and unflatten it 😖

    Brendan McDonald

    11/01/2022, 12:55 AM
    Hey y'all, where can I track upcoming releases? I saw that historically they were weekly on Wednesdays, but looking at https://github.com/airbytehq/airbyte/releases it seems they no longer follow a weekly pattern

    Duck Psy

    11/01/2022, 3:57 AM
    Hi team, can I use MySQL (instead of PostgreSQL) as the database for Airbyte?

    Andrew Exlet

    11/01/2022, 6:12 AM
    Hi. I’ve set up a MySQL<>Snowflake connection using CDC from MySQL. It works great for exactly 24 hours, then stops processing changes from the source. It states the sync has succeeded, yet no records are emitted or committed. Lost on why. Can anyone provide an assist?

    Aazam Thakur

    11/01/2022, 8:35 AM
    Is it possible to set field pointers for record selectors for each stream individually?

    godlin ampcome

    11/01/2022, 9:33 AM
    Hi. I'm trying to insert my CSV file into my database using Airbyte. The CSV file changes over time and has some extra spaces in its formatting. Is it possible to edit my CSV file using Airbyte?
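Airbyte is generally extract-and-load rather than a file editor, so cleanup like this usually happens before or after the sync. Assuming the problem is stray whitespace around fields (my reading of the question), a small pre-processing step can normalize the file before Airbyte picks it up:

```python
import csv
import io

def strip_csv_whitespace(text: str) -> str:
    """Strip leading/trailing whitespace from every field of a CSV string."""
    reader = csv.reader(io.StringIO(text), skipinitialspace=True)
    out = io.StringIO()
    writer = csv.writer(out)
    for row in reader:
        writer.writerow([field.strip() for field in row])
    return out.getvalue()

raw = "id, name , city\n1,  Alice ,Paris\n"   # hypothetical messy input
clean = strip_csv_whitespace(raw)
```

The same function applied to a file on disk (read, clean, rewrite) would give Airbyte a consistent CSV to ingest each run.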

    Patrik Deke

    11/01/2022, 1:13 PM
    Hi guys, is it somehow possible to export and import the configuration of an Airbyte instance (e.g. all configured connections, connectors, etc.)?
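Besides any export feature in the UI settings (and the Octavia CLI, which if I recall was in alpha around this time for config-as-code), a generic fallback is snapshotting each resource type through the Config API yourself. A sketch that takes a fetch function so it stays testable without a live server; the endpoint paths are my assumption of the Config API and worth verifying:

```python
import json

# Resource list endpoints (assumed; verify against your API reference).
ENDPOINTS = {
    "sources": "/api/v1/sources/list",
    "destinations": "/api/v1/destinations/list",
    "connections": "/api/v1/connections/list",
}

def export_config(fetch, workspace_id: str) -> str:
    """Snapshot workspace config as a JSON string.

    `fetch(path, body)` should POST to the Airbyte API and return the
    parsed response dict; it is injected so the sketch is testable.
    """
    snapshot = {
        name: fetch(path, {"workspaceId": workspace_id})
        for name, path in ENDPOINTS.items()
    }
    return json.dumps(snapshot, indent=2)
```

Re-importing would mean replaying the snapshot through the corresponding create endpoints, which is more involved (IDs change), so a purpose-built tool is preferable where available.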

    Adil Karim

    11/01/2022, 2:10 PM
    Hello! I have an existing Temporal cluster that I’d like to use with the open source Airbyte workers - is there a way to set the Temporal namespace as an option?

    Dusty Shapiro

    11/01/2022, 2:14 PM
    Just updated the Airbyte Helm chart to 0.40.38 and I'm getting an error when attempting to deploy to my dev environment:
    │ Error: error validating "": error validating data: [ValidationError(Deployment.spec.template.spec.containers[0]): unknown field "limits" in io.k8s.api.core.v1.Container, ValidationError(Deployment.spec.template.spec.containers[0]): unknown field "requests" in io.k8s.api.core.v1.Container]
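That validation message says Kubernetes found limits and requests directly on the container object, where they are not valid fields; in the Container API they belong nested under resources. It may be worth checking whether a values override in the chart renders them at the wrong level (this is my reading of the error, not a confirmed chart bug). The valid shape, with placeholder values:

```yaml
# Valid Kubernetes container spec: limits/requests live under `resources`,
# never directly on the container.
resources:
  requests:
    cpu: 250m
    memory: 512Mi
  limits:
    cpu: "1"
    memory: 1Gi
```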

    Duck Psy

    11/01/2022, 2:16 PM
    Hi team, can I use MySQL (instead of PostgreSQL) as the database for Airbyte?

    Rose Hooper

    11/01/2022, 3:17 PM
    I threw a couple PRs out there to address issues I've been having getting Airbyte going on AWS Graviton (arm64) instances... Is the team tagging misc PRs with hacktoberfest-accepted?

    Dan Cook

    11/01/2022, 3:25 PM
    Similar to the Fivetran log connector, is there a log connector on the horizon for Airbyte?
    Fivetran generates structured log events from connectors, dashboard user actions, and Fivetran API calls. The Fivetran Log Connector delivers your logs and account metadata to a schema in your destination. This metadata includes granular Fivetran consumption information.

    Srinidhi krishnamurthy

    11/01/2022, 3:59 PM
    Hi team, we understand Temporal is offered as a SaaS service. We would like to know if we can use Temporal SaaS instead of the built-in Temporal containers for our Airbyte setup in prod and nonprod.

    Sebastian Brickel

    11/01/2022, 4:17 PM
    Hi team, has anyone else experienced this issue with Instagram where the connector fails due to:
    Additional Failure Information: 'end_time'
    if one includes the stream:
    user_lifetime_insights
    Without this stream the connector works as expected.

    Dan Siegel

    11/01/2022, 4:55 PM
    • Is this your first time deploying Airbyte?: No
    • OS Version / Instance: Amazon Linux
    • Memory / Disk: 4Gb
    • Deployment: K8
    • Airbyte Version: 0.40.11
    • Source name/version: All
    • Destination name/version: Redshift with S3 Staging
    • Step: Worker Start up
    • Description: This is the main message we have at the top of each sync log:
    message='io.temporal.serviceclient.CheckedExceptionWrapper: java.util.concurrent.ExecutionException: java.lang.RuntimeException: io.airbyte.workers.exception.WorkerException: Running the launcher replication-orchestrator failed', type='java.lang.RuntimeException', nonRetryable=false
    The error above comes regardless of connector. Here are a few cherry-picked errors from the attached log as well:
    • errors: $.access_key_id: object found, string expected, $.secret_access_key: object found, string expected
    • software.amazon.awssdk.services.s3.model.S3Exception: The request signature we calculated does not match the signature you provided. Check your key and signing method. (Service: S3, Status Code: 403, Request ID: 17233C9B8CA0B1E7)
    We are using the same S3 creds within the K8 deployment as we are within the connector. The credentials DO have blanket access to the bucket. We are able to generate objects in the target bucket, and we do see all the logs getting generated into that bucket. Both source and destinations pass checks.
    Airbyte Logs.txt