# ask-community-for-troubleshooting
  • Alan Klein

    06/29/2023, 7:10 PM
    Can Airbyte take a JSON response from a REST API, convert the output to a .csv and then upload it to S3?
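    The S3 destination does support CSV as an output format, so a REST API source synced to an S3 destination configured for CSV should cover this. For intuition, here is a minimal Python sketch of the same flow done by hand; the endpoint, bucket, and key are hypothetical placeholders:

        import csv
        import io

        import boto3
        import requests

        # Fetch JSON records from the API (hypothetical endpoint);
        # assumes a non-empty list of flat JSON objects.
        records = requests.get("https://api.example.com/items").json()

        # Flatten the records into CSV in memory.
        buf = io.StringIO()
        writer = csv.DictWriter(buf, fieldnames=sorted(records[0].keys()))
        writer.writeheader()
        writer.writerows(records)

        # Upload the CSV to S3 (hypothetical bucket and key).
        boto3.client("s3").put_object(
            Bucket="my-bucket",
            Key="exports/items.csv",
            Body=buf.getvalue().encode("utf-8"),
        )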
  • Bob Blandford

    06/29/2023, 7:12 PM
    Could not connect with provided configuration. HikariPool-1 - Connection is not available, request timed out after 60001ms.
  • Bob Blandford

    06/29/2023, 7:15 PM
    How many connections to the same source can I have?
  • Bob Blandford

    06/29/2023, 7:33 PM
    Can the source and destination be the same database with different schemas?
  • Matt Pucci

    06/30/2023, 1:36 AM
    Do syncs normally take significantly longer after a reset of the connection?
  • Danilo Drobac

    06/30/2023, 6:28 AM
    Has anybody successfully had a Google Ads app application accepted? I've tried twice to apply (with more than 7 days in between) and I haven't received anything in response...
  • Shubham Singh

    06/30/2023, 5:30 PM
    Hello, I have deployed Airbyte using the official Helm chart and now I am trying to set up Google Search Console as a source, but I am getting this error:
    Internal Server Error: The specified bucket does not exist (Service: S3, Status Code: 404, Request ID: 176D792AC02C652A, Extended Request ID: dd9025bab4ad464b049177c95eb6ebf374d3b3fd1af9251148b658df7ac2e3e8)
    I have S3 disabled and I am using MinIO. Can someone tell me which bucket it is trying to find? This is the log I see in the airbyte-server pod:
    at io.airbyte.commons.server.handlers.SchedulerHandler.checkSourceConnectionFromSourceCreate(SchedulerHandler.java:225) ~[io.airbyte-airbyte-commons-server-0.50.1.jar:?]
            at io.airbyte.server.apis.SchedulerApiController.lambda$executeSourceCheckConnection$1(SchedulerApiController.java:48) ~[io.airbyte-airbyte-server-0.50.1.jar:?]
            at io.airbyte.server.apis.ApiHelper.execute(ApiHelper.java:26) ~[io.airbyte-airbyte-server-0.50.1.jar:?]
            at io.airbyte.server.apis.SchedulerApiController.executeSourceCheckConnection(SchedulerApiController.java:48) ~[io.airbyte-airbyte-server-0.50.1.jar:?]
            at io.airbyte.server.apis.$SchedulerApiController$Definition$Exec.dispatch(Unknown Source) ~[io.airbyte-airbyte-server-0.50.1.jar:?]
            at io.micronaut.context.AbstractExecutableMethodsDefinition$DispatchedExecutableMethod.invoke(AbstractExecutableMethodsDefinition.java:371) ~[micronaut-inject-3.9.2.jar:3.9.2]
            at io.micronaut.context.DefaultBeanContext$4.invoke(DefaultBeanContext.java:594) ~[micronaut-inject-3.9.2.jar:3.9.2]
            at io.micronaut.web.router.AbstractRouteMatch.execute(AbstractRouteMatch.java:303) ~[micronaut-router-3.9.2.jar:3.9.2]
            at io.micronaut.web.router.RouteMatch.execute(RouteMatch.java:111) ~[micronaut-router-3.9.2.jar:3.9.2]
            at io.micronaut.http.context.ServerRequestContext.with(ServerRequestContext.java:103) ~[micronaut-http-3.9.2.jar:3.9.2]
            at io.micronaut.http.server.RouteExecutor.lambda$executeRoute$14(RouteExecutor.java:659) ~[micronaut-http-server-3.9.2.jar:3.9.2]
            at reactor.core.publisher.FluxDeferContextual.subscribe(FluxDeferContextual.java:49) ~[reactor-core-3.5.0.jar:3.5.0]
            at reactor.core.publisher.InternalFluxOperator.subscribe(InternalFluxOperator.java:62) ~[reactor-core-3.5.0.jar:3.5.0]
            at reactor.core.publisher.FluxSubscribeOn$SubscribeOnSubscriber.run(FluxSubscribeOn.java:194) ~[reactor-core-3.5.0.jar:3.5.0]
            at io.micronaut.reactive.reactor.instrument.ReactorInstrumentation.lambda$init$0(ReactorInstrumentation.java:62) ~[micronaut-runtime-3.9.2.jar:3.9.2]
            at reactor.core.scheduler.WorkerTask.call(WorkerTask.java:84) ~[reactor-core-3.5.0.jar:3.5.0]
            at reactor.core.scheduler.WorkerTask.call(WorkerTask.java:37) ~[reactor-core-3.5.0.jar:3.5.0]
            at io.micronaut.scheduling.instrument.InvocationInstrumenterWrappedCallable.call(InvocationInstrumenterWrappedCallable.java:53) ~[micronaut-context-3.9.2.jar:3.9.2]
            at java.util.concurrent.FutureTask.run(FutureTask.java:317) ~[?:?]
            at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144) ~[?:?]
            at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642) ~[?:?]
            at java.lang.Thread.run(Thread.java:1589) ~[?:?]
    2023-06-30 15:24:19 ERROR i.a.s.a.ApiHelper(execute):40 - Unexpected Exception
    software.amazon.awssdk.services.s3.model.NoSuchBucketException: The specified bucket does not exist (Service: S3, Status Code: 404, Request ID: 176D792AC02C652A, Extended Request ID: dd9025bab4ad464b049177c95eb6ebf374d3b3fd1af9251148b658df7ac2e3e8)
            at software.amazon.awssdk.protocols.xml.internal.unmarshall.AwsXmlPredicatedResponseHandler.handleErrorResponse(AwsXmlPredicatedResponseHandler.java:156) ~[aws-xml-protocol-2.20.65.jar:?]
            at software.amazon.awssdk.protocols.xml.internal.unmarshall.AwsXmlPredicatedResponseHandler.handleResponse(AwsXmlPredicatedResponseHandler.java:108) ~[aws-xml-protocol-2.20.65.jar:?]
            at software.amazon.awssdk.protocols.xml.internal.unmarshall.AwsXmlPredicatedResponseHandler.handle(AwsXmlPredicatedResponseHandler.java:85) ~[aws-xml-protocol-2.20.65.jar:?]
            at software.amazon.awssdk.protocols.xml.internal.unmarshall.AwsXmlPredicatedResponseHandler.handle(AwsXmlPredicatedResponseHandler.java:43) ~[aws-xml-protocol-2.20.65.jar:?]
            at software.amazon.awssdk.awscore.client.handler.AwsSyncClientHandler$Crc32ValidationResponseHandler.handle(AwsSyncClientHandler.java:95) ~[aws-core-2.20.65.jar:?]
            at software.amazon.awssdk.core.internal.handler.BaseClientHandler.lambda$successTransformationResponseHandler$7(BaseClientHandler.java:270) ~[sdk-core-2.20.65.jar:?]
            at software.amazon.awssdk.core.internal.http.pipeline.stages.HandleResponseStage.execute(HandleResponseStage.java:40) ~[sdk-core-2.20.65.jar:?]
            at software.amazon.awssdk.core.internal.http.pipeline.stages.HandleResponseStage.execute(HandleResponseStage.java:30) ~[sdk-core-2.20.65.jar:?]
            at software.amazon.awssdk.core.internal.http.pipeline.RequestPipelineBuilder$ComposingRequestPipelineStage.execute(RequestPipelineBuilder.java:206) ~[sdk-core-2.20.65.jar:?]
            at software.amazon.awssdk.core.internal.http.pipeline.stages.ApiCallAttemptTimeoutTrackingStage.execute(ApiCallAttemptTimeoutTrackingStage.java:73) ~[sdk-core-2.20.65.jar:?]
            at software.amazon.awssdk.core.internal.http.pipeline.stages.ApiCallAttemptTimeoutTrackingStage.execute(ApiCallAttemptTimeoutTrackingStage.java:42) ~[sdk-core-2.20.65.jar:?]
            at software.amazon.awssdk.core.internal.http.pipeline.stages.TimeoutExceptionHandlingStage.execute(TimeoutExceptionHandlingStage.java:78) ~[sdk-core-2.20.65.jar:?]
            at software.amazon.awssdk.core.internal.http.pipeline.stages.TimeoutExceptionHandlingStage.execute(TimeoutExceptionHandlingStage.java:40) ~[sdk-core-2.20.65.jar:?]
            at software.amazon.awssdk.core.internal.http.pipeline.stages.ApiCallAttemptMetricCollectionStage.execute(ApiCallAttemptMetricCollectionStage.java:50) ~[sdk-core-2.20.65.jar:?]
            at software.amazon.awssdk.core.internal.http.pipeline.stages.ApiCallAttemptMetricCollectionStage.execute(ApiCallAttemptMetricCollectionStage.java:36) ~[sdk-core-2.20.65.jar:?]
            at software.amazon.awssdk.core.internal.http.pipeline.stages.RetryableStage.execute(RetryableStage.java:81) ~[sdk-core-2.20.65.jar:?]
            at software.amazon.awssdk.core.internal.http.pipeline.stages.RetryableStage.execute(RetryableStage.java:36) ~[sdk-core-2.20.65.jar:?]
            at software.amazon.awssdk.core.internal.http.pipeline.RequestPipelineBuilder$ComposingRequestPipelineStage.execute(RequestPipelineBuilder.java:206) ~[sdk-core-2.20.65.jar:?]
            at software.amazon.awssdk.core.internal.http.StreamManagingStage.execute(StreamManagingStage.java:56) ~[sdk-core-2.20.65.jar:?]
            at software.amazon.awssdk.core.internal.http.StreamManagingStage.execute(StreamManagingStage.java:36) ~[sdk-core-2.20.65.jar:?]
            at software.amazon.awssdk.core.internal.http.pipeline.stages.ApiCallTimeoutTrackingStage.executeWithTimer(ApiCallTimeoutTrackingStage.java:80) ~[sdk-core-2.20.65.jar:?]
            at software.amazon.awssdk.core.internal.http.pipeline.stages.ApiCallTimeoutTrackingStage.execute(ApiCallTimeoutTrackingStage.java:60) ~[sdk-core-2.20.65.jar:?]
            at software.amazon.awssdk.core.internal.http.pipeline.stages.ApiCallTimeoutTrackingStage.execute(ApiCallTimeoutTrackingStage.java:42) ~[sdk-core-2.20.65.jar:?]
            at software.amazon.awssdk.core.internal.http.pipeline.stages.ApiCallMetricCollectionStage.execute(ApiCallMetricCollectionStage.java:48) ~[sdk-core-2.20.65.jar:?]
            at software.amazon.awssdk.core.internal.http.pipeline.stages.ApiCallMetricCollectionStage.execute(ApiCallMetricCollectionStage.java:31) ~[sdk-core-2.20.65.jar:?]
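    For what it's worth, NoSuchBucketException is literal: whatever bucket name the chart's log/state storage settings point at does not exist on the MinIO backend. A minimal sketch for listing what MinIO actually has, assuming an in-cluster endpoint and credentials (all values hypothetical; substitute your chart's):

        import boto3

        # Point an S3 client at the in-cluster MinIO service
        # (hypothetical endpoint and credentials).
        s3 = boto3.client(
            "s3",
            endpoint_url="http://airbyte-minio-svc:9000",
            aws_access_key_id="minio",
            aws_secret_access_key="minio123",
        )

        # The bucket from the error message should appear here;
        # if not, it needs to be created or the config corrected.
        print([b["Name"] for b in s3.list_buckets()["Buckets"]])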
  • Octavia Squidington III

    06/30/2023, 7:45 PM
    🔥 Community Office Hours starts in 15 minutes 🔥 At 1pm PDT click here to join us on Zoom!
  • brenda sofia

    06/30/2023, 7:53 PM
    Hello! I am trying to set up an S3 destination but I am getting this error message:
    Configuration check failed
    Could not connect to the S3 bucket with the provided configuration. <YOUR-AKID>/YYYYMMDD/REGION/SERVICE/aws4_request". (Service: Amazon S3; Status Code: 400; Error Code: AuthorizationHeaderMalformed; Request ID: Q0FH7TT4GYZC20SW; S3 Extended Request ID: phAERC6BZQ+tY3kuZHTuVQ5Nyp/DAX4+qol1v7J2/vNPOn4yUCQey+dSI021AT+JWacnHkYWCAI=; Proxy: null)
    According to AWS, error code 400 with AuthorizationHeaderMalformed means the bucket is not in the region I am specifying, but I'm sure my bucket is in us-west-2, which is the same region I am selecting in my Airbyte destination.
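    AuthorizationHeaderMalformed usually does point at a region mismatch in the SigV4 credential scope. One way to confirm the bucket's actual region independently of the destination form, sketched with boto3 (bucket name hypothetical):

        import boto3

        # GetBucketLocation reports the bucket's real region;
        # note the API returns None for buckets in us-east-1.
        resp = boto3.client("s3").get_bucket_location(Bucket="my-bucket")
        print(resp["LocationConstraint"] or "us-east-1")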
  • kylashpriya NA

    06/30/2023, 11:11 PM
    Hello, I have been struggling with a weird S3 source connector issue for a few days. My requirement is pretty simple: I have self-hosted Airbyte in AWS EKS, and I want to incrementally load files matching the pattern s3 bucket/inbound/reports/fama/YYYYMMDDHHSSMMNNNN_fama_report.csv into Snowflake.
    • Airbyte: 0.40.26
    • S3 connector: 0.1.27
    Connection settings: Path Prefix=inbound/reports/fama/, Pattern of files to replicate="**/*Families_report.csv". Setting up the connection doesn't work: I see only the default columns, not my files' columns. I have tried all of the combinations below:
    path_prefix = , path_pattern = inbound/reports/fama/*_fama_report.csv
    Error
    
    path_prefix = , path_pattern = inbound/reports/fama/20221129161802_fama_report.csv
    Error
    
    path_prefix = , path_pattern =bm-production-bi-yieldigo/inbound/reports/fama/20221129161802_fama_report.csv
    Error
    
    path_prefix =inbound/reports/fama/  path_pattern =20221129161802_fama_report.csv
    Error
    
    path_prefix =inbound/reports/fama/  path_pattern =**
    Error
    
    path_prefix =/inbound/reports/fama/  path_pattern =**
    Error
    
    path_prefix =/inbound/reports/fama/  path_pattern =**/*.csv
    Error
    
    path_prefix =/inbound/reports/fama/  path_pattern =**/20221129161802_fama_report.csv
    SUCCESS but no files
    
    path_prefix =/inbound/reports/fama  path_pattern =20221129161802_fama_report.csv
    SUCCESS but no files
    
    
    path_prefix =inbound/reports/fama/  path_pattern =20221129161802_fama_report.csv
    SUCCESS but no files
    
    
    path_prefix =inbound/reports/fama/  path_pattern =**/20221129161802_fama_report.csv
    SUCCESS but no files
    
    
    path_prefix =  path_pattern =inbound/reports/fama/20221129161802_fama_report.csv
    SUCCESS but no files
    
    path_prefix =  path_pattern =/inbound/reports/fama/20221129161802_fama_report.csv
    SUCCESS but no files
    
    
    path_prefix =  path_pattern =bm-production-bi-yieldigo/inbound/reports/fama/20221129161802_fama_report.csv
    SUCCESS but no files
    
    path_prefix =inbound/reports/fama/   path_pattern =**/*.csv
    Error
    
    
    path_prefix =  path_pattern =inbound/reports/fama/**/*.csv
    Error
    
    path_prefix =  path_pattern =inbound/reports/fama/**/20221129161802_fama_report.csv
    Works
    
    path_prefix =  path_pattern =inbound/reports/fama/**/*.csv
    Error
    
    path_prefix =  path_pattern =inbound/reports/fama/**
    Error
    
    path_prefix =  path_pattern =inbound/reports/fama/**/
    Error
    
    path_prefix =  path_pattern =inbound/reports/fama/**/**/*.csv
    Error
    
    path_prefix =inbound/reports/fama/**/  path_pattern =20221129161802_fama_report.csv
    Works
    
    path_prefix =inbound/reports/fama/**/  path_pattern =2022*_fama_report.csv
    Works
    
    path_prefix =inbound/reports/fama/**/  path_pattern =*_fama_report.csv
    Works
    
    path_prefix =inbound/reports/fama/**/  path_pattern =*.csv
    Works
    
    path_prefix =inbound/reports/fama/**/  path_pattern =**
    Works
    
    path_prefix =inbound/reports/*/**/  path_pattern =**
    Works
    
    
    path_prefix =inbound/**/**/**/  path_pattern =**
    Deletes all columns
    
    path_prefix =inbound/*/**/**/ path_pattern =**
    Deletes all columns
    
    path_prefix =inbound/*/**/**/ path_pattern =**/*.csv
    Deletes all columns
    
    path_prefix =inbound/*/*/**/ path_pattern =**/*.csv
    Deletes all columns
    
    path_prefix = path_pattern =inbound/**/*.csv
    Error
    
    path_prefix =inbound/  path_pattern =**/*.csv
    Error
    
    path_prefix =inbound/**  path_pattern =*.csv
    Deletes all columns
    
    path_prefix =inbound/**/  path_pattern =*.csv
    Deletes all columns
    
    path_prefix =inbound/**/  path_pattern =**/*.csv
    Deletes all columns
    
    
    path_prefix =inbound/reports/fama/**  path_pattern =*.csv
    Works
    
    path_prefix =inbound/reports/**  path_pattern =*.csv
    Deletes all columns
    
    path_prefix = , path_pattern=inbound/reports/fama/**/
    Error
    
    
    path_prefix = , path_pattern =inbound/reports/fama/*.csv
    Error
    
    
    
    Path Prefix=inbound/reports/fama/  Pattern of files to replicate=**
    but none of them seems to work. Could someone please help me here? Apologies if it's a super simple mistake! Thanks in advance!
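    One way to narrow down combinations like these offline is to test real object keys against candidate patterns locally. The sketch below assumes the connector evaluates path_pattern with wcmatch-style GLOBSTAR globbing, which is an assumption worth checking against your connector version:

        from wcmatch import glob

        # A real key from the bucket, and the patterns under test.
        key = "inbound/reports/fama/20221129161802_fama_report.csv"
        patterns = [
            "inbound/reports/fama/*_fama_report.csv",
            "inbound/reports/fama/**/*.csv",
            "**/*.csv",
        ]

        # Print which patterns match the key under GLOBSTAR semantics.
        for pattern in patterns:
            print(pattern, glob.globmatch(key, pattern, flags=glob.GLOBSTAR))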
  • Haim Beyhan

    07/01/2023, 7:57 AM
    I am always getting gRPC errors about message size. I tried to update the values below and restarted the temporal, worker and server pods, but I still get the same error:
        io.grpc.StatusRuntimeException: RESOURCE_EXHAUSTED: grpc: received message larger than max (9765003 vs. 4194304)
    The dynamic config values I set:
        limit.blobSize.error:
          - value: 62914560 # 60MB
            constraints: {}
        limit.blobSize.warn:
          - value: 52428800 # 50MB
            constraints: {}
  • Matheus

    07/02/2023, 6:05 PM
    I keep getting failed sync attempts between Instagram and Postgres. Does anyone know if it is a setup issue or something with the connectors? Failure reason: 'end_time'. Log attached.
    e1d4cb08_b333_4185_b39a_770557f59139_job_15_attempt_3_txt.txt
  • maddu kiran

    07/03/2023, 5:18 AM
    Hi everyone. Has anyone else faced this issue with Google Sheets? It shows "Server not available", "something went wrong", etc., then after some time it reconnects automatically, disconnects again, and the sync fails. If anyone has faced this issue, please let me know how you resolved it.
  • Ekansh Verma

    07/03/2023, 6:46 AM
    Hi team! My DynamoDB source connector syncs 1.2 GB of data and then fails due to OOM issues with a heap size of 2 GB. Is anyone facing the same issue? What could be a possible reason? Is it because of the use of complex data structures, such as lists of dictionaries, in Java?
  • Katja WiesmĂĽller

    07/03/2023, 9:52 AM
    Hello everyone, I am using the Salesforce connector to load data into BigQuery. The sync runs every 15 minutes and all performed syncs have the status "sync succeeded", but the last entry in BigQuery is dated 06/28/2023. The connector is up to date. Any tips? @Daniel Pietschmann @Julian Felix Rost @Florian Klompmaker @Jan Malte Behrens
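    When every sync reports success but the warehouse looks stale, one quick check is the newest _airbyte_emitted_at value actually written, which separates "nothing was loaded" from "the source had no new rows". A sketch with hypothetical project, dataset, and table names:

        from google.cloud import bigquery

        client = bigquery.Client()

        # Newest record Airbyte wrote to this table (names hypothetical).
        query = """
            SELECT MAX(_airbyte_emitted_at) AS last_emitted
            FROM `my-project.salesforce.accounts`
        """
        row = list(client.query(query).result())[0]
        print(row.last_emitted)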
  • Haki Dere

    07/03/2023, 10:19 AM
    Hi, we are getting this error while trying to create a connection between Oracle DB and BigQuery. What would be the preferred solution here? @kapa.ai
    2023-07-03 10:15:36 INFO i.a.a.c.AirbyteApiClient(retryWithJitterThrows):234 - Attempt 2 to call to write discover schema result error: io.airbyte.api.client.invoker.generated.ApiException: writeDiscoverCatalogResult call failed with: 413 - {"message":"Request Entity Too Large","_links":{"self":{"href":"/api/v1/sources/write_discover_catalog_result","templated":false}},"_embedded":{"errors":[{"message":"The content length [13860158] exceeds the maximum allowed content length [10485760]"}]}}
  • Dhrubark Sarmah

    07/03/2023, 10:26 AM
    Hi guys, I am hosting Airbyte on EC2. Setting up the source works fine, but connecting to the destination throws an error. What should I do to solve this?
    check_connection_destination-failure-e6263e01-aa8f-45e6-8df9-b91cb877d7c3.txt
  • Vahid

    07/03/2023, 12:17 PM
    This message contains interactive elements.
    check_connection_source-failure-76fd2d1d-8b2e-431c-bab0-fc6f82bf7bb5.txt
  • Jesus Rivero

    07/03/2023, 4:09 PM
    Hi, I'm facing an issue with Octavia when I try to apply a destination. It throws the following error: Could not find configuration for STANDARD_DESTINATION_DEFINITION
  • Diego Barros

    07/03/2023, 7:08 PM
    Hello guys, I need help with my SQL Server connection.
  • Diego Barros

    07/03/2023, 7:08 PM
    Description:
    1 - I have a SQL Server connection in Airbyte Cloud that runs every day at 05:00 AM.
    2 - The first sync (using JDBC) works 100%.
    3 - The second sync (using change data capture) works 100%.
  • Diego Barros

    07/03/2023, 7:09 PM
    250579486-e084fba6-e228-410f-b359-18269e7c6048.png
  • Diego Barros

    07/03/2023, 7:09 PM
    4 - Since then nothing has been updated, but I get a false positive execution status.
  • Diego Barros

    07/03/2023, 7:10 PM
    I don't know if this problem is because my table has a large volume of data (283 GB) or because my Airbyte isn't able to collect the CDC log data.
  • Diego Barros

    07/03/2023, 7:10 PM
    Looking at the job log, I see the message: Producer failure com.microsoft.sqlserver.jdbc.SQLServerException: The result set is closed.
  • Diego Barros

    07/03/2023, 7:10 PM
    and another message: The main thread is exiting while children non-daemon threads from a connector are still active.
  • Diego Barros

    07/03/2023, 7:11 PM
    Just to add: I have many other connections using CDC that are working. Does anyone know what could be the cause of this? I appreciate any help!
  • Danilo Drobac

    07/03/2023, 7:45 PM
    Google Analytics (GA4) session data does not match the UI data when I export using Airbyte. I'm using a custom report (which includes Default Channel Group) because none of the standard reports includes it. My custom report definition is:
    {"name": "Traffic acquisition", "dimensions": ["date", "defaultChannelGroup"], "metrics": ["sessions"]}
    Am I missing something obvious?
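    The report definition itself looks well-formed. One way to take Airbyte out of the equation is to run the same dimensions and metric directly against the GA4 Data API and compare the numbers with the UI; a sketch with a hypothetical property ID and date range:

        from google.analytics.data_v1beta import BetaAnalyticsDataClient
        from google.analytics.data_v1beta.types import (
            DateRange,
            Dimension,
            Metric,
            RunReportRequest,
        )

        client = BetaAnalyticsDataClient()

        # Same shape as the custom report: sessions by date and channel group.
        response = client.run_report(
            RunReportRequest(
                property="properties/123456789",  # hypothetical property ID
                dimensions=[
                    Dimension(name="date"),
                    Dimension(name="defaultChannelGroup"),
                ],
                metrics=[Metric(name="sessions")],
                date_ranges=[DateRange(start_date="2023-06-01", end_date="2023-06-30")],
            )
        )

        for row in response.rows:
            print([d.value for d in row.dimension_values], row.metric_values[0].value)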
  • Octavia Squidington III

    07/03/2023, 7:45 PM
    🔥 Community Office Hours starts in 15 minutes 🔥 Topic and schedule posted in #C045VK5AF54. At 1pm PDT click here to join us on Zoom!
  • Evan Wang

    07/03/2023, 10:23 PM
    Hello! I have a question about running a docker command that uses Airbyte to ingest from Salesforce, all within a KubernetesPodOperator in Airflow: https://stackoverflow.com/questions/76608392/using-a-kubernetespodoperator-to-run-a-docker-command-that-uses-airbytes-salesf Please let me know if there is a more appropriate method/forum for this. Thanks!
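    For readers with the same question, the rough shape of such a task is sketched below; the import path, image tag, and file locations vary by provider and connector version, so treat every name as illustrative (the config and catalog files would also need to be mounted into the pod):

        from airflow.providers.cncf.kubernetes.operators.kubernetes_pod import (
            KubernetesPodOperator,
        )

        # Run an Airbyte connector image as a pod; assumes the image
        # entrypoint accepts the standard read arguments, and that
        # /secrets and /config are provided via volumes (omitted here).
        read_salesforce = KubernetesPodOperator(
            task_id="read_salesforce",
            name="read-salesforce",
            image="airbyte/source-salesforce:2.0.9",  # hypothetical tag
            arguments=[
                "read",
                "--config", "/secrets/config.json",
                "--catalog", "/config/catalog.json",
            ],
            get_logs=True,
        )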