# ask-community-for-troubleshooting
  • Arthur Ho
    06/13/2023, 7:37 AM
    Hello! Sorry if this isn't the right channel. I'm using a self-hosted version of Airbyte with a few dozen live connections and Slack sync notifications activated. I believe it would be really great to have the name of the sync that failed in the notification message content; it would make the notification much more actionable. For instance, the content of a message:
    Your connection from Postgres to BigQuery just failed...
    This happened with sync started on Monday, June 12, 2023 at 12:58:41 PM Coordinated Universal Time, running for 1 minute 20 seconds, as the Job was cancelled.
    You can access its logs here: <http://airbyte-airbyte-webapp-svc:80/workspaces/679da635-0911-4722-91c2-62d4170fe629/connections/6a4ea414-db91-40f4-a472-5aa6c9c7df9c>
    Job ID: 152
    Really happy to post this message somewhere else if needed. Have a great day!
  • Ignacio Martínez de Toda
    06/13/2023, 7:46 AM
    Hi there, I'm trying to use MongoDB as a source to stream data to BigQuery, but the MongoDB source connection seems to have issues. I keep getting the same errors, Failed to fetch schema or Non-json response, which make my stream fail and also crash the GCP VM where I've deployed Airbyte, so I have to restart the VM and run it again every time. I can't rely on cron jobs to automatically sync my DB, as it will eventually crash and take the VM down. Can someone assist here?
  • Indar
    06/13/2023, 11:57 AM
    I'm sorry to hear that you're experiencing issues with the MongoDB source connection in Airbyte and the impact it's having on your GCP VM. Here are some steps you can take to troubleshoot and address the problem:
    1. Verify the MongoDB connection: Double-check that your MongoDB connection details (e.g., hostname, port, authentication credentials) are correct, and ensure that the MongoDB server is accessible from the VM where Airbyte is deployed. You can test the connection using command-line tools like mongo or a MongoDB client (see the sketch after this list).
    2. Check the Airbyte configuration: Review your Airbyte configuration for the MongoDB source connector. Ensure that the JSON schema for the source is correctly defined, including the proper field types and structures; any inconsistencies in the schema can result in errors when fetching data.
    3. Check the MongoDB data: Ensure that the collection you are streaming from contains valid, well-formed JSON documents. Malformed documents or unexpected data types can cause the "Non-json response" error, so you may need to clean or transform the data within MongoDB first.
    4. Update Airbyte and the MongoDB connector: Make sure you are using the latest versions of Airbyte and the MongoDB source connector. Check the Airbyte documentation and GitHub repository for known issues or bug fixes related to MongoDB connectivity; updating might resolve the problem.
    5. Increase resource allocation: If the GCP VM is crashing due to resource limitations, consider allocating more memory, CPU, or disk space to handle the processing and streaming requirements.
    6. Check logs and error messages: Examine the logs and error messages generated by Airbyte for specific error codes or stack traces that point to the root cause.
    7. Engage with the Airbyte community: Describe your issue, with relevant logs and error messages, through the official documentation, GitHub repository, or community forums; community members and developers can help troubleshoot and provide specific guidance.
    By following these steps, you should be able to identify and resolve the issues with the MongoDB source connection and the stability of your GCP VM.
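    As a quick way to run steps 1 and 3 outside Airbyte, a minimal connectivity-and-sampling check along the following lines can help isolate whether the problem is the database or the connector. This is an illustrative sketch only: the connection string, database name, and the pymongo dependency are assumptions, not part of the original thread.

        from pymongo import MongoClient
        from pymongo.errors import PyMongoError

        # Placeholder connection string: substitute the real host and credentials.
        MONGO_URI = "mongodb://user:password@mongo-host:27017/?authSource=admin"

        try:
            client = MongoClient(MONGO_URI, serverSelectionTimeoutMS=5000)
            client.admin.command("ping")  # forces a real round trip to the server
            db = client["my_database"]    # placeholder database name
            # Sample one document per collection to spot malformed or unexpected data.
            for name in db.list_collection_names():
                doc = db[name].find_one()
                print(name, list(doc.keys()) if doc else "empty collection")
        except PyMongoError as exc:
            print(f"MongoDB check failed: {exc}")

    If this script fails from the same VM, the issue is connectivity or credentials rather than Airbyte itself.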
  • Thiago Villani
    06/13/2023, 12:09 PM
    Hello, is it possible to sync several .pdf files with Airbyte? If so, what source should be used?
  • Nazif Ishrak
    06/13/2023, 1:03 PM
    Hello, I am currently facing a challenge with a full refresh sync operation from my source database to my destination database. I understand that the batch size for these operations isn’t directly configurable, and as a result, the process seems to be attempting to extract all the data at once, which subsequently leads to a crash due to memory overload. I am interested in exploring possible solutions to this issue. Could you kindly guide me on how I could perform a full refresh sync operation in a paginated or chunked manner? This way, the memory burden would potentially be reduced, thereby preventing the process from crashing. Any guidance on this would be greatly appreciated.
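    For context, the generic pattern that keeps memory flat during a full extract is to stream rows through a server-side cursor and hand them to the destination in fixed-size batches, rather than materializing the whole table at once. The sketch below only illustrates that idea outside Airbyte; it assumes a Postgres-style source reachable via psycopg2, and the DSN, table name, batch size, and load_batch_into_destination helper are all hypothetical.

        import psycopg2

        BATCH_SIZE = 10_000  # placeholder; tune to the memory budget

        conn = psycopg2.connect("host=source-db dbname=app user=reader password=secret")
        # A named cursor makes psycopg2 use a server-side cursor, so rows are
        # fetched in batches instead of being loaded into memory all at once.
        with conn, conn.cursor(name="full_refresh_stream") as cur:
            cur.itersize = BATCH_SIZE
            cur.execute("SELECT * FROM my_table")  # placeholder table
            while True:
                rows = cur.fetchmany(BATCH_SIZE)
                if not rows:
                    break
                load_batch_into_destination(rows)  # hypothetical helper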
  • Richa Rochna
    06/13/2023, 1:29 PM
    Hi Team, I am facing the errors below. The connections, sources, and destinations screens won't load in the UI due to this error. Using Airbyte version 0.41.0. We don't have this notify_schema_changes column in the connection table. The same version was working fine earlier. Can you please help?
    2023-06-13 13:17:51 ERROR i.a.s.e.UncaughtExceptionHandler(handle):28 - Uncaught exception
    org.jooq.exception.DataAccessException: SQL [select * from "public"."connection" where "public"."connection"."id" = cast(? as uuid)]; Error while reading field: "public"."connection"."notify_schema_changes", at JDBC index: 19
    Caused by: java.sql.SQLException: Error while reading field: "public"."connection"."notify_schema_changes", at JDBC index: 19
    Caused by: org.postgresql.util.PSQLException: Cannot cast to boolean: "ignore"
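    For what it's worth, the stack trace points at the value stored in connection.notify_schema_changes rather than at a missing column: jOOQ is reading the text "ignore" where it expects a boolean. A hedged way to confirm what that column actually contains is to query Airbyte's internal config database directly; this is a read-only sketch, and the DSN is a placeholder for wherever your Airbyte Postgres instance lives.

        import psycopg2

        # Placeholder DSN for the Airbyte *config* database (the internal
        # Postgres that backs the server), not one of your data connections.
        conn = psycopg2.connect("host=airbyte-db dbname=airbyte user=docker password=docker")
        with conn, conn.cursor() as cur:
            # Does the column exist, and what type does it have?
            cur.execute(
                """
                SELECT column_name, data_type
                FROM information_schema.columns
                WHERE table_name = 'connection'
                  AND column_name = 'notify_schema_changes'
                """
            )
            print("column definition:", cur.fetchall())
            # What values are stored in it?
            cur.execute("SELECT id, notify_schema_changes FROM connection LIMIT 20")
            for row in cur.fetchall():
                print(row)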
  • Octavia Squidington III
    06/13/2023, 1:45 PM
    🔥 Office Hours starts in 15 minutes 🔥 Topic and schedule posted in #C045VK5AF54. At 16:00 CEST / 10am EDT, click here to join us on Zoom!
  • Nazif Ishrak
    06/13/2023, 3:34 PM
    How long are the office hours?
  • Nohelia Merino
    06/13/2023, 5:34 PM
    Does Airbyte have some sort of async mechanism to check connection status after creation? I'm looking to implement this whenever the connection takes too much time and there is a need to retry it automatically. @kapa.ai
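    As far as I know there is no push-style callback for this in the OSS API, so the usual approach is a poll-with-timeout loop. A rough sketch against the Configuration API is below; the base URL and source ID are placeholders, and the /v1/sources/check_connection endpoint and its "status" field should be double-checked against your Airbyte version.

        import time
        import requests

        AIRBYTE_API = "http://localhost:8000/api/v1"  # placeholder base URL
        SOURCE_ID = "00000000-0000-0000-0000-000000000000"  # placeholder

        def wait_for_source_check(timeout_s: float = 300, interval_s: float = 15) -> bool:
            """Re-run the source connection check until it succeeds or times out."""
            deadline = time.monotonic() + timeout_s
            while time.monotonic() < deadline:
                resp = requests.post(
                    f"{AIRBYTE_API}/sources/check_connection",
                    json={"sourceId": SOURCE_ID},
                    timeout=60,
                )
                resp.raise_for_status()
                if resp.json().get("status") == "succeeded":
                    return True
                time.sleep(interval_s)  # back off before retrying
            return False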
  • Clinton Berry
    06/13/2023, 8:20 PM
    I'm trying to understand how some of these database sources work, and I am missing something. When trying to use the mssql source, I set the SSL config to disabled and it still tries to use SSL. I noticed in the source code there are two sources for many of the databases: mssql and mssql-strict-encrypt. But only one source shows up in the interface. The changelog also has a section for "strict encrypt". Is there a chance it is using that one? How can I tell which source it is using?
  • Nipuna Prashan
    06/14/2023, 12:59 AM
    I deployed Airbyte, and the pod-sweeper is giving the following error: couldn't get current server API group list: Get "http://localhost:8080/api?timeout=32s": dial tcp [::1]:8080: connect: connection refused
  • Nipuna Prashan
    06/14/2023, 3:53 AM
    The pod-sweeper cannot connect to the Kubernetes API server.
  • Indar
    06/14/2023, 4:55 AM
    The error message you're encountering suggests that the Airbyte pod-sweeper component is unable to connect to the Kubernetes API server. This error commonly occurs when the API server is not running or is inaccessible. Here are a few steps you can take to troubleshoot and resolve the issue:
    1. Check the Kubernetes API server: Verify that the API server is running and accessible. You can check its status with:

        kubectl cluster-info

    Ensure that the API server is listed and running without errors. If there are issues with the API server, you may need to resolve them before the pod-sweeper can establish a connection (a small in-cluster probe is sketched after this list).
    2. Check the pod-sweeper configuration: Ensure that the pod-sweeper component is configured correctly. Check its configuration files or environment variables and make sure they match your cluster's setup, paying attention to the API server URL and any authentication or authorization settings.
    3. Verify network connectivity: Confirm that there is network connectivity between the pod-sweeper and the Kubernetes API server, with no network restrictions or firewalls blocking communication between the two.
    4. Check the pod-sweeper logs: Review the component's logs for more detail about the error; they may contain additional messages that help identify the root cause:

        kubectl logs <pod-sweeper-pod-name>

    Replace <pod-sweeper-pod-name> with the actual name of the pod-sweeper pod.
    5. Ensure the correct namespace: If you have multiple namespaces in your cluster, verify that the pod-sweeper is deployed in the same namespace as the Airbyte deployment; it needs to be there to access the relevant resources.
    6. Reach out to the Airbyte community: If the issue persists, check the documentation, GitHub repository, or community forums to report the issue and seek guidance from other users or developers who may have encountered a similar problem.
    By following these steps and investigating the underlying causes, you should be able to troubleshoot and resolve the connection issue with the Airbyte pod-sweeper component.
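    One more detail worth noting: an error mentioning localhost:8080 usually means the Kubernetes client found no in-cluster (or kubeconfig) configuration and fell back to its default address. A minimal in-cluster probe for step 1 could look like the sketch below; it assumes the official kubernetes Python client is installed and running inside a pod with a service account, and the namespace is a placeholder.

        from kubernetes import client, config

        try:
            # Inside a pod this reads the mounted service-account token and the
            # KUBERNETES_SERVICE_HOST/PORT env vars -- the path a component
            # should use instead of falling back to localhost:8080.
            config.load_incluster_config()
            v1 = client.CoreV1Api()
            pods = v1.list_namespaced_pod(namespace="airbyte")  # placeholder namespace
            print(f"API server reachable; {len(pods.items)} pods in namespace")
        except Exception as exc:
            print(f"API server not reachable from this pod: {exc}")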
  • Mikhail Masyagin
    06/14/2023, 7:26 AM
    Hello! I'm getting these warnings on each connection. How can I avoid them?
    at io.temporal.internal.worker.ActivityWorker$TaskHandlerImpl.handle(ActivityWorker.java:206) ~[temporal-sdk-1.17.0.jar:?]
    	at io.temporal.internal.worker.ActivityWorker$TaskHandlerImpl.handle(ActivityWorker.java:179) ~[temporal-sdk-1.17.0.jar:?]
    	at io.temporal.internal.worker.PollTaskExecutor.lambda$process$0(PollTaskExecutor.java:93) ~[temporal-sdk-1.17.0.jar:?]
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144) ~[?:?]
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642) ~[?:?]
    	at java.lang.Thread.run(Thread.java:1589) ~[?:?]
    2023-06-14 07:12:51 WARN i.t.i.a.ActivityTaskExecutors$BaseActivityTaskExecutor(execute):114 - Activity failure. ActivityId=56274297-7f99-3c25-ae80-949e9b8ac6cd, activityType=RecordWorkflowCountMetric, attempt=5
    io.airbyte.commons.temporal.exception.RetryableException: io.airbyte.api.client.invoker.generated.ApiException: getWorkspaceByConnectionId call failed with: 500 - {"message":"Internal Server Error: io.airbyte.config.persistence.ConfigNotFoundException: config type: STANDARD_WORKSPACE id: f4e4619c-da4f-480f-9f02-a10f509e44bb","exceptionClassName":"java.lang.RuntimeException","exceptionStack":["java.lang.RuntimeException: io.airbyte.config.persistence.ConfigNotFoundException: config type: STANDARD_WORKSPACE id: f4e4619c-da4f-480f-9f02-a10f509e44bb","\tat io.airbyte.config.persistence.ConfigRepository.getStandardWorkspaceFromConnection(ConfigRepository.java:452)","\tat io.airbyte.commons.server.handlers.WorkspacesHandler.getWorkspaceByConnectionId(WorkspacesHandler.java:188)","\tat io.airbyte.server.apis.WorkspaceApiController.lambda$getWorkspaceByConnectionId$7(WorkspaceApiController.java:124)","\tat io.airbyte.server.apis.ApiHelper.execute(ApiHelper.java:26)","\tat io.airbyte.server.apis.WorkspaceApiController.getWorkspaceByConnectionId(WorkspaceApiController.java:124)","\tat io.airbyte.server.apis.$WorkspaceApiController$Definition$Exec.dispatch(Unknown Source)","\tat io.micronaut.context.AbstractExecutableMethodsDefinition$DispatchedExecutableMethod.invoke(AbstractExecutableMethodsDefinition.java:371)","\tat io.micronaut.context.DefaultBeanContext$4.invoke(DefaultBeanContext.java:594)","\tat io.micronaut.web.router.AbstractRouteMatch.execute(AbstractRouteMatch.java:303)","\tat io.micronaut.web.router.RouteMatch.execute(RouteMatch.java:111)","\tat io.micronaut.http.context.ServerRequestContext.with(ServerRequestContext.java:103)","\tat io.micronaut.http.server.RouteExecutor.lambda$executeRoute$14(RouteExecutor.java:659)","\tat reactor.core.publisher.FluxDeferContextual.subscribe(FluxDeferContextual.java:49)","\tat reactor.core.publisher.InternalFluxOperator.subscribe(InternalFluxOperator.java:62)","\tat reactor.core.publisher.FluxSubscribeOn$SubscribeOnSubscriber.run(FluxSubscribeOn.java:194)","\tat io.micronaut.reactive.reactor.instrument.ReactorInstrumentation.lambda$init$0(ReactorInstrumentation.java:62)","\tat reactor.core.scheduler.WorkerTask.call(WorkerTask.java:84)","\tat reactor.core.scheduler.WorkerTask.call(WorkerTask.java:37)","\tat io.micronaut.scheduling.instrument.InvocationInstrumenterWrappedCallable.call(InvocationInstrumenterWrappedCallable.java:53)","\tat java.base/java.util.concurrent.FutureTask.run(FutureTask.java:317)","\tat java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144)","\tat java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642)","\tat java.base/java.lang.Thread.run(Thread.java:1589)","Caused by: io.airbyte.config.persistence.ConfigNotFoundException: config type: STANDARD_WORKSPACE id: f4e4619c-da4f-480f-9f02-a10f509e44bb","\tat io.airbyte.config.persistence.ConfigRepository.lambda$getStandardWorkspaceNoSecrets$1(ConfigRepository.java:232)","\tat java.base/java.util.Optional.orElseThrow(Optional.java:403)","\tat io.airbyte.config.persistence.ConfigRepository.getStandardWorkspaceNoSecrets(ConfigRepository.java:232)","\tat io.airbyte.config.persistence.ConfigRepository.getStandardWorkspaceFromConnection(ConfigRepository.java:450)","\t... 
22 more"],"rootCauseExceptionClassName":"java.lang.Class","rootCauseExceptionStack":["io.airbyte.config.persistence.ConfigNotFoundException: config type: STANDARD_WORKSPACE id: f4e4619c-da4f-480f-9f02-a10f509e44bb","\tat io.airbyte.config.persistence.ConfigRepository.lambda$getStandardWorkspaceNoSecrets$1(ConfigRepository.java:232)","\tat java.base/java.util.Optional.orElseThrow(Optional.java:403)","\tat io.airbyte.config.persistence.ConfigRepository.getStandardWorkspaceNoSecrets(ConfigRepository.java:232)","\tat io.airbyte.config.persistence.ConfigRepository.getStandardWorkspaceFromConnection(ConfigRepository.java:450)","\tat io.airbyte.commons.server.handlers.WorkspacesHandler.getWorkspaceByConnectionId(WorkspacesHandler.java:188)","\tat io.airbyte.server.apis.WorkspaceApiController.lambda$getWorkspaceByConnectionId$7(WorkspaceApiController.java:124)","\tat io.airbyte.server.apis.ApiHelper.execute(ApiHelper.java:26)","\tat io.airbyte.server.apis.WorkspaceApiController.getWorkspaceByConnectionId(WorkspaceApiController.java:124)","\tat io.airbyte.server.apis.$WorkspaceApiController$Definition$Exec.dispatch(Unknown Source)","\tat io.micronaut.context.AbstractExecutableMethodsDefinition$DispatchedExecutableMethod.invoke(AbstractExecutableMethodsDefinition.java:371)","\tat io.micronaut.context.DefaultBeanContext$4.invoke(DefaultBeanContext.java:594)","\tat io.micronaut.web.router.AbstractRouteMatch.execute(AbstractRouteMatch.java:303)","\tat io.micronaut.web.router.RouteMatch.execute(RouteMatch.java:111)","\tat io.micronaut.http.context.ServerRequestContext.with(ServerRequestContext.java:103)","\tat io.micronaut.http.server.RouteExecutor.lambda$executeRoute$14(RouteExecutor.java:659)","\tat reactor.core.publisher.FluxDeferContextual.subscribe(FluxDeferContextual.java:49)","\tat reactor.core.publisher.InternalFluxOperator.subscribe(InternalFluxOperator.java:62)","\tat reactor.core.publisher.FluxSubscribeOn$SubscribeOnSubscriber.run(FluxSubscribeOn.java:194)","\tat io.micronaut.reactive.reactor.instrument.ReactorInstrumentation.lambda$init$0(ReactorInstrumentation.java:62)","\tat reactor.core.scheduler.WorkerTask.call(WorkerTask.java:84)","\tat reactor.core.scheduler.WorkerTask.call(WorkerTask.java:37)","\tat io.micronaut.scheduling.instrument.InvocationInstrumenterWrappedCallable.call(InvocationInstrumenterWrappedCallable.java:53)","\tat java.base/java.util.concurrent.FutureTask.run(FutureTask.java:317)","\tat java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144)","\tat java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642)","\tat java.base/java.lang.Thread.run(Thread.java:1589)"]}
    	at io.airbyte.workers.temporal.scheduling.activities.RecordMetricActivityImpl.getWorkspaceId(RecordMetricActivityImpl.java:118) ~[io.airbyte-airbyte-workers-0.44.5.jar:?]
    	at io.airbyte.workers.temporal.scheduling.activities.$RecordMetricActivityImpl$Definition$Intercepted.$$access$$getWorkspaceId(Unknown Source) ~[io.airbyte-airbyte-workers-0.44.5.jar:?]
    	at io.airbyte.workers.temporal.scheduling.activities.$RecordMetricActivityImpl$Definition$Exec.dispatch(Unknown Source) ~[io.airbyte-airbyte-workers-0.44.5.jar:?]
    	at io.micronaut.context.AbstractExecutableMethodsDefinition$DispatchedExecutableMethod.invoke(AbstractExecutableMethodsDefinition.java:371) ~[micronaut-inject-3.9.1.jar:3.9.1]
    	at io.micronaut.aop.chain.MethodInterceptorChain.proceed(MethodInterceptorChain.java:128) ~[micronaut-aop-3.9.1.jar:3.9.1]
    	at io.micronaut.cache.interceptor.CacheInterceptor.doContextProceed(CacheInterceptor.java:736) ~[micronaut-cache-core-3.5.0.jar:3.5.0]
    	at io.micronaut.cache.interceptor.CacheInterceptor.doProceed(CacheInterceptor.java:740) ~[micronaut-cache-core-3.5.0.jar:3.5.0]
    	at io.micronaut.cache.interceptor.CacheInterceptor.interceptSync(CacheInterceptor.java:430) ~[micronaut-cache-core-3.5.0.jar:3.5.0]
    	at io.micronaut.cache.interceptor.CacheInterceptor.intercept(CacheInterceptor.java:156) ~[micronaut-cache-core-3.5.0.jar:3.5.0]
    	at io.micronaut.aop.chain.MethodInterceptorChain.proceed(MethodInterceptorChain.java:137) ~[micronaut-aop-3.9.1.jar:3.9.1]
    	at io.airbyte.workers.temporal.scheduling.activities.$RecordMetricActivityImpl$Definition$Intercepted.getWorkspaceId(Unknown Source) ~[io.airbyte-airbyte-workers-0.44.5.jar:?]
    	at io.airbyte.workers.temporal.scheduling.activities.RecordMetricActivityImpl.generateTags(RecordMetricActivityImpl.java:99) ~[io.airbyte-airbyte-workers-0.44.5.jar:?]
    	at io.airbyte.workers.temporal.scheduling.activities.RecordMetricActivityImpl.recordWorkflowCountMetric(RecordMetricActivityImpl.java:61) ~[io.airbyte-airbyte-workers-0.44.5.jar:?]
    	at jdk.internal.reflect.DirectMethodHandleAccessor.invoke(DirectMethodHandleAccessor.java:104) ~[?:?]
  • Nazif Ishrak
    06/14/2023, 9:23 AM
    Can someone tell me whether this is accurate or not: “Airbyte uses a process called batch processing where it divides the data into manageable chunks, commonly referred to as “batches.” The batch size for Airbyte is usually set at 10,000 records, meaning it will try to load 10,000 records at a time. If the total number of records exceeds 10,000, Airbyte doesn’t stop or crash. Instead, it will continue to process the data in batches. For instance, if there are 30,000 records, Airbyte will process them in three separate batches of 10,000 each. However, if a single batch exceeds the memory capacity because the individual records are too large, it may result in an out-of-memory error. This issue isn’t directly related to the number of records but is about the size of the data. If you encounter such issues, it might be necessary to optimize your data, adjust your system resources, or modify the configuration settings to better accommodate your specific use case.”
  • Nazif Ishrak
    06/14/2023, 9:32 AM
    Following up on my message above: is that passage even true for a full refresh sync? So basically the number of rows doesn't matter, and only the size of each row is a problem?
  • Clément Salaün
    06/14/2023, 10:42 AM
    Hey there 👋 Not a “connection issue” per se, but more of a general question re connectors and the CDK: is there a list somewhere of people who are open to paid work developing Airbyte connectors, or a way to request connectors and pledge a bounty? I'm interested in adding a properly implemented, official Airbyte connector for our platform, and I'd rather have it developed by someone within the community who already knows the drill than shop for someone on general-purpose freelancing platforms.
  • Aiman
    06/14/2023, 12:36 PM
    Is there any other option from Airbyte in Airflow besides AirbyteJobSensor if I'm writing Python code from scratch against the API? I need it for an ETL improvement, thank you.
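    If the goal is to replace AirbyteJobSensor with plain Python, the common pattern is to trigger the sync and then poll the job yourself. A sketch against the OSS Configuration API follows; the base URL and connection ID are placeholders, and the exact response shapes of /v1/connections/sync and /v1/jobs/get should be verified against your Airbyte version.

        import time
        import requests

        AIRBYTE_API = "http://localhost:8000/api/v1"  # placeholder base URL
        CONNECTION_ID = "00000000-0000-0000-0000-000000000000"  # placeholder

        # Trigger a sync and remember the job id.
        resp = requests.post(
            f"{AIRBYTE_API}/connections/sync",
            json={"connectionId": CONNECTION_ID},
            timeout=60,
        )
        resp.raise_for_status()
        job_id = resp.json()["job"]["id"]

        # Poll the job until it reaches a terminal state -- roughly what
        # AirbyteJobSensor does for you inside Airflow.
        while True:
            resp = requests.post(f"{AIRBYTE_API}/jobs/get", json={"id": job_id}, timeout=60)
            resp.raise_for_status()
            status = resp.json()["job"]["status"]
            if status in ("succeeded", "failed", "cancelled"):
                print(f"job {job_id} finished with status: {status}")
                break
            time.sleep(30)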
  • Luis Vicente
    06/14/2023, 1:12 PM
    Can you set up a transformation using the Airbyte API?
  • George Myrianthous
    06/14/2023, 1:19 PM
    Hi everyone, I wanted to ask if it’s possible to change the field used to partition the destination BigQuery table. We use the Mixpanel source, and by default the partitioning field on the BQ table is _airbyte_emitted_at; however, we would want to change it to a different field. Is this possible?
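    As far as I know, the BigQuery destination doesn't expose the partition column as a setting, so the usual workaround is a downstream step (a custom dbt model or a scheduled query) that materializes a copy of the Airbyte output partitioned on your own column. Below is a sketch of that idea with the google-cloud-bigquery client; the project, dataset, table, and event_time column are all placeholders.

        from google.cloud import bigquery

        client = bigquery.Client(project="my-project")  # placeholder project

        # Rebuild the Airbyte output as a new table partitioned on our own column.
        sql = """
            CREATE OR REPLACE TABLE `my-project.analytics.mixpanel_events_partitioned`
            PARTITION BY DATE(event_time)  -- placeholder partition column
            AS SELECT * FROM `my-project.airbyte.mixpanel_events`
        """
        client.query(sql).result()  # .result() blocks until the job finishes
        print("repartitioned table created")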
  • Luis Vicente
    06/14/2023, 1:24 PM
    The Airbyte API provided in the cloud version doesn't support the creation of Operations (normalisation, dbt or webhook). Is this going to be added?
  • Pedro Doria
    06/14/2023, 1:27 PM
    Good morning everybody. I'm seeing a new kind of occurrence with Airbyte: all connections from a given source are being disabled by themselves. (This has happened 3x in the last week; before that it hadn't happened even once.) Has anyone experienced this?
  • Peter
    06/14/2023, 4:34 PM
    Hi everyone. Does anyone know if there is any issue with sync notifications on a self-hosted Airbyte instance? I have the notification settings configured correctly (I receive the notification when I run the test); however, I'm not receiving any notifications after a connection runs, not even failure notifications.
  • Aman Kesharwani
    06/14/2023, 5:18 PM
    Hello! I am having trouble setting up an S3 source connection and am getting the access-denied error below. The same bucket works in an S3 destination connection. I am using an EC2 IAM instance profile for the connection. Please note that the bucket itself does not have permissions attached, but the airbyte folder inside the bucket has full permissions attached to the instance profile. Attaching a snapshot of the connection:
    2023-06-14 17:11:37 INFO i.a.w.i.VersionedAirbyteStreamFactory(internalLog):312 - Iterating S3 bucket 'xxxxxxxx' with prefix: 'airbyte' 
    2023-06-14 17:11:37 ERROR i.a.w.i.VersionedAirbyteStreamFactory(internalLog):308 - Traceback (most recent call last):
      File "/airbyte/integration_code/source_s3/source_files_abstract/source.py", line 63, in check_connection
        for file_info in stream.filepath_iterator():
      File "/airbyte/integration_code/source_s3/stream.py", line 56, in filepath_iterator
        response = client.list_objects_v2(**kwargs)
      File "/usr/local/lib/python3.9/site-packages/botocore/client.py", line 530, in _api_call
        return self._make_api_call(operation_name, kwargs)
      File "/usr/local/lib/python3.9/site-packages/botocore/client.py", line 964, in _make_api_call
        raise error_class(parsed_response, operation_name)
    botocore.exceptions.ClientError: An error occurred (AccessDenied) when calling the ListObjectsV2 operation: Access Denied
    2023-06-14 17:11:37 ERROR i.a.w.i.VersionedAirbyteStreamFactory(internalLog):308 - Check failed
    2023-06-14 17:11:38 INFO i.a.w.g.DefaultCheckConnectionWorker(run):115 - Check connection job received output: io.airbyte.config.StandardCheckConnectionOutput@3a320cd8[status=failed,message=ClientError('An error occurred (AccessDenied) when calling the ListObjectsV2 operation: Access Denied'),additionalProperties={}]
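    One thing that stands out: the failing call is ListObjectsV2, and listing requires s3:ListBucket granted on the bucket ARN itself (optionally restricted with an s3:prefix condition), not on the objects under the prefix. That may be why the destination, which writes objects, works while the source, which lists them, is denied. A quick way to reproduce the exact call from the same instance profile, outside Airbyte (a sketch; the bucket name stays redacted as in the log above, the prefix is taken from it, and boto3 is assumed installed):

        import boto3
        from botocore.exceptions import ClientError

        s3 = boto3.client("s3")  # picks up the EC2 instance profile credentials

        try:
            # The same call the connector's check makes: list with the configured prefix.
            resp = s3.list_objects_v2(Bucket="xxxxxxxx", Prefix="airbyte")  # placeholders
            print("ListObjectsV2 OK,", resp.get("KeyCount", 0), "keys found")
        except ClientError as exc:
            # AccessDenied here means the instance profile lacks s3:ListBucket on
            # the bucket ARN, even if it can read/write objects under the prefix.
            print("ListObjectsV2 failed:", exc.response["Error"]["Code"])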
  • Octavia Squidington III
    06/14/2023, 7:45 PM
    🔥 Office Hours starts in 15 minutes 🔥 Topic and schedule posted in #C045VK5AF54. At 1PM PDT, click here to join us on Zoom!
  • Joe
    06/14/2023, 8:09 PM
    Hello, when doing an incremental sync from Oracle -> Snowflake, I noticed some records were being skipped when they were inserted into the source DB at the same time the Airbyte sync was happening. I see in the changelog there may be a fix for this... but for some reason the version number was left blank in the documentation (see image below). Was this fix included in 0.3.21, in 0.3.22, or never? Thanks. https://docs.airbyte.com/integrations/sources/oracle/
  • Arun Singh
    06/14/2023, 9:19 PM
    I am having issues connecting MongoDB to Kafka; it states:
    The form is invalid. Please make sure that all fields are correct.
    How can I fix this?
  • Deepank Agarwal
    06/15/2023, 4:32 AM
    Hello, I'm doing a POC with a self-hosted Airbyte application on an EC2 server (t3.medium) to set up a pipeline from our primary database (MongoDB) to the analytics database (Postgres). I've been able to connect both of my databases successfully, but when I try to fetch the schema of my source database, I repeatedly see the following error:
    Unknown error (http.504.iiBiWsjYibLWn3dGRhuYUd)
    My MongoDB database has a large number of collections and documents. Can that be the reason for the error? I tried the exact same process with a new test database containing a single collection, and it worked perfectly. It would be really helpful if anyone could help me with the reason for, or a solution to, this error. Thanks.
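    An http.504 is a gateway timeout, which fits schema discovery simply taking too long on a database with many collections. A rough way to gauge how much work discovery has to do, runnable from any machine that can reach the database (a sketch; the connection string and database name are placeholders, and pymongo is assumed installed):

        from pymongo import MongoClient

        client = MongoClient("mongodb://user:password@mongo-host:27017/")  # placeholder
        db = client["primary_db"]  # placeholder database name

        names = db.list_collection_names()
        print(f"{len(names)} collections for Airbyte to discover")
        for name in names:
            # estimated_document_count() is cheap: it reads collection metadata
            # instead of scanning documents.
            print(name, db[name].estimated_document_count())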
  • Slackbot
    06/15/2023, 4:54 AM
    This message was deleted.
  • Slackbot
    06/15/2023, 4:55 AM
    This message was deleted.