# ask-community-for-troubleshooting
  • Lucas Abreu

    04/10/2023, 8:07 PM
    Hi there, small question. The Postgres CDC documentation says:
    The Postgres source performs best on small databases (under 100GB).
    I was wondering how much worse performance gets with larger databases, say 1 TB.
  • Masly Alexandra Velasquez Monsalve

    04/10/2023, 8:15 PM
    Hello everybody 👋
    • Deployment: I'm using Airbyte open source
    • Airbyte Version: 0.40.25
    • Source name/version: Google Ads
    • Destination name/version: S3
    • Description: Hi support, I set up a custom GAQL query in my Google Ads source. This is the query I'm using, and it validates fine in the Google Ads query builder:
    SELECT customer.id, ad_group.id, ad_group_criterion.criterion_id, ad_group_criterion.status, ad_group_criterion.type, ad_group_criterion.display_name, ad_group_criterion.keyword.match_type, ad_group_criterion.keyword.text, ad_group_criterion.age_range.type, ad_group_criterion.gender.type, ad_group_criterion.income_range.type FROM ad_group_criterion
    When I run the source test in Airbyte, it generates this error: "Unable to connect to Google Ads API with the provided configuration - Cannot select or filter on the following segments: 'segments.date' (could not support requested resources: 'AD_GROUP_CRITERION'), since segment is incompatible with the resource in the FROM clause or other selected segmenting resources." Any idea why this error occurs? Thanks for your help!
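    A possible direction, offered as a sketch rather than a confirmed fix: the Google Ads source appends segments.date to custom GAQL queries so it can sync incrementally, and the ad_group_criterion resource does not accept that segment, which would explain why the query validates in the query builder but fails in the source test. Querying a date-segmentable report such as keyword_view (or age_range_view / gender_view for the demographic criteria) may avoid the error; the rewrite below is hypothetical and untested:
      SELECT
        customer.id,
        ad_group.id,
        ad_group_criterion.criterion_id,
        ad_group_criterion.status,
        ad_group_criterion.keyword.match_type,
        ad_group_criterion.keyword.text
      FROM keyword_view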
  • Konstantin Lackner

    04/10/2023, 9:27 PM
    Feedback regarding the Google Analytics 4 (GA4) source connector: the sessionSource field of the traffic_sources stream contains an empty entry (pos. 3 in the table), which makes up ~14% of our total traffic, so it's important for us to know which source that is. However, looking at the Google Analytics dashboard directly, the "(direct)" traffic source is followed by "bing", which makes me think the empty field is a bug in the GA4 connector. Additionally, there is a discrepancy in the number of sessions by traffic source. Airbyte version: 0.42.0, GA4 source connector version: 0.1.3, BigQuery destination version: 1.2.18.
  • Trung Luong

    04/10/2023, 10:09 PM
    Hi all, do you know if Airbyte supports DB2 connections for iSeries/AS400? We are doing a POC on DB2 but are having trouble getting a connection. Thanks.
  • Gonzalo Bottari

    04/10/2023, 11:57 PM
    Hi all! I'm pretty new to Airbyte. I have this issue with MongoDB as a source: the source test passes successfully, but when I try to create a connection I get this issue.
  • Martin Jung

    04/11/2023, 12:58 AM
    Hey all, with the column selection feature out for Cloud, is there a plan to bring this feature into the open-source project? Thanks
  • Sachit Khanna

    04/11/2023, 1:39 AM
    Hey all, can the Octavia-CLI image be opened in interactive mode? I'm hoping to be able to inspect the image, and also to use it as the base layer, adding other tools on top of the container.
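    A rough sketch of what may work, assuming the octavia-cli image ships a shell and a Python toolchain (the tag and the extra package below are placeholders):
      # open an interactive shell instead of the CLI entrypoint, to inspect the image
      docker run -it --rm --entrypoint /bin/sh airbyte/octavia-cli:0.43.1
      # use it as a base layer and add tools on top (Dockerfile sketch)
      #   FROM airbyte/octavia-cli:0.43.1
      #   RUN pip install --no-cache-dir some-extra-tool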
  • Srinidhi krishnamurthy

    04/11/2023, 4:09 AM
    Hi Airbyte team, we are deploying Airbyte with Docker onto EC2, version 0.43.1. During deployment the webapp takes an available port on the machine. Is it possible to make the port configurable so the webapp container listens only on, say, 8080, and we can point the LB health check to 8080?
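    A sketch of one approach, with the caveat that the service name and container-side port depend on the docker-compose.yaml you deployed (recent files expose the UI through airbyte-proxy, older ones through airbyte-webapp): change only the host side of the published port so the UI is reachable solely on 8080 for the LB health check.
      # docker-compose.yaml, relevant fragment (hypothetical service/port values)
      services:
        airbyte-proxy:
          ports:
            - "8080:8000"   # host port 8080 -> the container port the UI listens on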
  • UUBOY scy

    04/11/2023, 5:28 AM
    Hi all, I have a source connector airbyte/source-mssql:1.0.10 and I'd like to add another connector, airbyte/source-mssql:0.1.7, but I got the following issue. Does anyone have an idea how to solve it? I can successfully add a tag later than 0.3.x, but adding a tag older than 0.2.x fails.
  • yuan sun

    04/11/2023, 5:46 AM
    Hi all, I want to develop a connector where the user enters the API address and an optional token on the page; after adding the source, it should collect data from the API. But now the error is as follows. How can I solve it?
    2023-04-11 034436 INFO i.a.w.i.VersionedAirbyteStreamFactory(create):107 - Reading messages from protocol version 0.2.0
    2023-04-11 034436 ERROR i.a.c.i.LineGobbler(voidCall):114 - WARNING: The requested image's platform (linux/amd64) does not match the detected host platform (linux/arm64/v8) and no specific platform was requested
    2023-04-11 034502 ERROR i.a.w.i.DefaultAirbyteStreamFactory(internalLog):163 - config: {'url': {'type': 'string', 'description': 'The URL for the API endpoint.'}, 'token': {'type': 'string', 'description': 'The API token for authentication.'}}
    2023-04-11 034502 ERROR i.a.w.i.DefaultAirbyteStreamFactory(internalLog):163 - config: {'url': 'check', 'token': '--config'}
    2023-04-11 034502 ERROR i.a.c.i.LineGobbler(voidCall):114 - usage: main.py [-h] {spec,check,discover,read} ...
    2023-04-11 034503 ERROR i.a.c.i.LineGobbler(voidCall):114 - main.py: error: argument command: invalid choice: 'source_config.json' (choose from 'spec', 'check', 'discover', 'read')
    2023-04-11 034503 WARN i.a.w.g.DefaultCheckConnectionWorker(run):107 - Check connection job subprocess finished with exit code 2
    2023-04-11 034503 ERROR i.a.w.g.DefaultCheckConnectionWorker(run):124 - Unexpected error while checking connection: io.airbyte.workers.exception.WorkerException: Error checking connection status: no status nor failure reason were outputted
        at io.airbyte.workers.WorkerUtils.throwWorkerException(WorkerUtils.java:198) ~[io.airbyte-airbyte-commons-worker-0.41.0.jar:?]
        at io.airbyte.workers.general.DefaultCheckConnectionWorker.run(DefaultCheckConnectionWorker.java:117) ~[io.airbyte-airbyte-commons-worker-0.41.0.jar:?]
        at io.airbyte.workers.general.DefaultCheckConnectionWorker.run(DefaultCheckConnectionWorker.java:41) ~[io.airbyte-airbyte-commons-worker-0.41.0.jar:?]
        at io.airbyte.workers.temporal.TemporalAttemptExecution.lambda$getWorkerThread$5(TemporalAttemptExecution.java:195) ~[io.airbyte-airbyte-workers-0.41.0.jar:?]
        at java.lang.Thread.run(Thread.java:1589) ~[?:?]
    2023-04-11 034503 INFO i.a.w.t.TemporalAttemptExecution(lambda$getWorkerThread$5):198 - Completing future exceptionally... io.airbyte.workers.exception.WorkerException: Unexpected error while getting checking connection.
        at io.airbyte.workers.general.DefaultCheckConnectionWorker.run(DefaultCheckConnectionWorker.java:126) ~[io.airbyte-airbyte-commons-worker-0.41.0.jar:?]
        at io.airbyte.workers.general.DefaultCheckConnectionWorker.run(DefaultCheckConnectionWorker.java:41) ~[io.airbyte-airbyte-commons-worker-0.41.0.jar:?]
        at io.airbyte.workers.temporal.TemporalAttemptExecution.lambda$getWorkerThread$5(TemporalAttemptExecution.java:195) ~[io.airbyte-airbyte-workers-0.41.0.jar:?]
        at java.lang.Thread.run(Thread.java:1589) ~[?:?]
    Caused by: io.airbyte.workers.exception.WorkerException: Error checking connection status: no status nor failure reason were outputted
        at io.airbyte.workers.WorkerUtils.throwWorkerException(WorkerUtils.java:198) ~[io.airbyte-airbyte-commons-worker-0.41.0.jar:?]
        at io.airbyte.workers.general.DefaultCheckConnectionWorker.run(DefaultCheckConnectionWorker.java:117) ~[io.airbyte-airbyte-commons-worker-0.41.0.jar:?]
        ... 3 more
    2023-04-11 034503 INFO i.a.c.i.LineGobbler(voidCall):114 -
    2023-04-11 034503 INFO i.a.c.i.LineGobbler(voidCall):114 - ----- END CHECK -----
    2023-04-11 034503 INFO i.a.c.i.LineGobbler(voidCall):114 -
    2023-04-11 034503 INFO i.a.w.t.TemporalAttemptExecution(get):163 - Stopping cancellation check scheduling...
    2023-04-11 034503 WARN i.t.i.a.ActivityTaskExecutors$BaseActivityTaskExecutor(execute):114 - Activity failure. ActivityId=a3ce5d78-0e2d-385f-9c6f-1e5863fff461, activityType=RunWithJobOutput, attempt=1 java.util.concurrent.ExecutionException: io.airbyte.workers.exception.WorkerException: Unexpected error while getting checking connection.
        at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:396) ~[?:?]
        at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:2073) ~[?:?]
        at io.airbyte.workers.temporal.TemporalAttemptExecution.get(TemporalAttemptExecution.java:161) ~[io.airbyte-airbyte-workers-0.41.0.jar:?]
        at io.airbyte.workers.temporal.check.connection.CheckConnectionActivityImpl.runWithJobOutput(CheckConnectionActivityImpl.java:115) ~[io.airbyte-airbyte-workers-0.41.0.jar:?]
        at jdk.internal.reflect.DirectMethodHandleAccessor.invoke(DirectMethodHandleAccessor.java:104) ~[?:?]
        at java.lang.reflect.Method.invoke(Method.java:578) ~[?:?]
        at io.temporal.internal.activity.RootActivityInboundCallsInterceptor$POJOActivityInboundCallsInterceptor.executeActivity(RootActivityInboundCallsInterceptor.java:64) ~[temporal-sdk-1.17.0.jar:?]
        at io.temporal.internal.activity.RootActivityInboundCallsInterceptor.execute(RootActivityInboundCallsInterceptor.java:43) ~[temporal-sdk-1.17.0.jar:?]
        at io.temporal.internal.activity.ActivityTaskExecutors$BaseActivityTaskExecutor.execute(ActivityTaskExecutors.java:95) ~[temporal-sdk-1.17.0.jar:?]
        at io.temporal.internal.activity.ActivityTaskHandlerImpl.handle(ActivityTaskHandlerImpl.java:92) ~[temporal-sdk-1.17.0.jar:?]
        at io.temporal.internal.worker.ActivityWorker$TaskHandlerImpl.handleActivity(ActivityWorker.java:241) ~[temporal-sdk-1.17.0.jar:?]
        at io.temporal.internal.worker.ActivityWorker$TaskHandlerImpl.handle(ActivityWorker.java:206) ~[temporal-sdk-1.17.0.jar:?]
        at io.temporal.internal.worker.ActivityWorker$TaskHandlerImpl.handle(ActivityWorker.java:179) ~[temporal-sdk-1.17.0.jar:?]
        at io.temporal.internal.worker.PollTaskExecutor.lambda$process$0(PollTaskExecutor.java:93) ~[temporal-sdk-1.17.0.jar:?]
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144) ~[?:?]
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642) ~[?:?]
        at java.lang.Thread.run(Thread.java:1589) ~[?:?]
    Caused by: io.airbyte.workers.exception.WorkerException: Unexpected error while getting checking connection.
        at io.airbyte.workers.general.DefaultCheckConnectionWorker.run(DefaultCheckConnectionWorker.java:126) ~[io.airbyte-airbyte-commons-worker-0.41.0.jar:?]
        at io.airbyte.workers.general.DefaultCheckConnectionWorker.run(DefaultCheckConnectionWorker.java:41) ~[io.airbyte-airbyte-commons-worker-0.41.0.jar:?]
        at io.airbyte.workers.temporal.TemporalAttemptExecution.lambda$getWorkerThread$5(TemporalAttemptExecution.java:195) ~[io.airbyte-airbyte-workers-0.41.0.jar:?]
        ... 1 more
    Caused by: io.airbyte.workers.exception.WorkerException: Error checking connection status: no status nor failure reason were outputted
        at io.airbyte.workers.WorkerUtils.throwWorkerException(WorkerUtils.java:198) ~[io.airbyte-airbyte-commons-worker-0.41.0.jar:?]
        at io.airbyte.workers.general.DefaultCheckConnectionWorker.run(DefaultCheckConnectionWorker.java:117) ~[io.airbyte-airbyte-commons-worker-0.41.0.jar:?]
        at io.airbyte.workers.general.DefaultCheckConnectionWorker.run(DefaultCheckConnectionWorker.java:41) ~[io.airbyte-airbyte-commons-worker-0.41.0.jar:?]
        at io.airbyte.workers.temporal.TemporalAttemptExecution.lambda$getWorkerThread$5(TemporalAttemptExecution.java:195) ~[io.airbyte-airbyte-workers-0.41.0.jar:?]
        ... 1 more
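    The "invalid choice: 'source_config.json'" error and the config dump showing {'url': 'check', 'token': '--config'} suggest the container is treating the command-line arguments (check --config source_config.json) as config values, which usually points to a broken main.py or Dockerfile entrypoint rather than a problem with the optional token itself. For comparison, a minimal sketch of the standard Python CDK entrypoint (the package and class names here are hypothetical):
      # main.py -- hand argv to the Airbyte CDK instead of parsing it yourself
      import sys

      from airbyte_cdk.entrypoint import launch
      from source_my_api import SourceMyApi  # hypothetical connector package/class

      if __name__ == "__main__":
          source = SourceMyApi()
          launch(source, sys.argv[1:])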
  • Kristina Ushakova

    04/11/2023, 6:04 AM
    Hello! I would like to raise an issue identical to this archived one: https://airbytehq.slack.com/archives/C01MFR03D5W/p1648835023376519?thread_ts=1648767258.626869&cid=C01MFR03D5W. The issue concerns the Sentry reporting integration not working as it should. I went over the setup docs here. The environment variable that is mentioned there is the Sentry DSN. I tested it locally by sending a zero-division error and it works fine, but it doesn't look like Airbyte is sending any errors through to Sentry. It looks like the two other env variables needed are:
    SENTRY_ENVIRONMENT=production
    JOB_ERROR_REPORTING_STRATEGY=sentry
    Their default values are "" and "logging", respectively. From the Airbyte env configs in EnvConfigs.java it looks like the strategy defaults to logging:
    @Override
      public JobErrorReportingStrategy getJobErrorReportingStrategy() {
        return getEnvOrDefault(JOB_ERROR_REPORTING_STRATEGY, JobErrorReportingStrategy.LOGGING, s -> {
          try {
            return JobErrorReportingStrategy.valueOf(s.toUpperCase());
          } catch (final IllegalArgumentException e) {
            LOGGER.info(s + " not recognized, defaulting to " + JobErrorReportingStrategy.LOGGING);
            return JobErrorReportingStrategy.LOGGING;
          }
        });
      }
    I may be wrong, as I'm unfortunately not a pro in Java. I would be grateful for any help on this issue.
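    For reference, a sketch of the .env settings this would imply; the DSN variable name is taken from EnvConfigs.java and should be double-checked against your Airbyte version:
      JOB_ERROR_REPORTING_STRATEGY=sentry
      JOB_ERROR_REPORTING_SENTRY_DSN=https://<key>@<org>.ingest.sentry.io/<project>
      SENTRY_ENVIRONMENT=production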
  • Toan Doan

    04/11/2023, 6:35 AM
    Hi guys, I have a question: when does an emitted record get committed to the destination? I have a case where a sync emitted 60k records and failed midway. Those 60k records did not get committed to the destination (only 1 record was committed). Thank you for reading this!
  • Anuj Shirgaonkar

    04/11/2023, 7:25 AM
    What is the plan to make the Airbyte API open source?
  • vismaya Kalaiselvan

    04/11/2023, 9:29 AM
    Is the new Airbyte API available for OSS users as well? It is specified in the screenshot below.
  • Abdullah Alhabshan

    04/11/2023, 10:41 AM
    Hello, I created a MySQL source connection and it works, but when I try it as a destination connection it doesn't work and the error is "Access denied". What is the problem?
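    "Access denied" on the destination side usually means the MySQL user Airbyte connects with lacks write privileges on the target database (or is restricted by host). A hedged sketch of the kind of grants a destination user typically needs; the user, host, password and database names are placeholders:
      CREATE USER 'airbyte'@'%' IDENTIFIED BY 'your_password_here';
      GRANT SELECT, INSERT, CREATE, DROP, ALTER ON your_database.* TO 'airbyte'@'%';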
  • Shreepad Khandve

    04/11/2023, 10:55 AM
    Hello team, I have created a custom connector and uploaded the Docker image to the instance to get the new connector. The connector works fine locally, and on the interface I can see emitted records as well, but when I look into the schema, the main table is blank. What could be the reason? Is this a version issue, or am I missing something?
  • Sushant

    04/11/2023, 11:05 AM
    I created an API key as per the new Airbyte API. I currently have an OSS account, but when I created the API key I got a notification that a cloud account with a 14-day trial was created. I would like to know whether I will still be able to access the Airbyte API after the 14 days if I use the OSS account.
  • Ivan Brcko

    04/11/2023, 11:38 AM
    Hi everyone 👋 I'm an Airbyte beginner who is trying to create a connection between MongoDB and BigQuery. The source and destination play nicely with each other when the source database has a small number of records (150k). When I try to do the same with a much bigger database (150M+ records) I'm unable to create a connection, since it fails during the
    /api/v1/sources/discover_schema
    call with a 502 Bad Gateway response and an nginx "An error occurred." error. While inspecting the logs, I noticed the following error:
    upstream prematurely closed connection while reading response header from upstream, client: 192.168.96.9, server: localhost, request: "POST /api/v1/sources/discover_schema HTTP/1.0", upstream: "http://192.168.96.4:8001/api/v1/sources/discover_schema"
    I would really appreciate any help or guidance on where to look next or what to take into consideration. Thanks 🙏
  • Shreepad Khandve

    04/11/2023, 12:18 PM
    Hello team, I have created a custom connector and uploaded the Docker image to the instance to get the new connector. The connector works fine locally, and on the interface I can see emitted records as well, but when I look into the schema, the main table is blank. What could be the reason? Is this a version issue, or am I missing something?
  • Tim Josefsson

    04/11/2023, 12:35 PM
    Hello everyone! I was itching to read up on some of the tutorials offered at https://airbyte.com/tutorials, however I'm unable to click/interact with anything below the search bar. I tried with both Chrome and Edge. Am I doing something wrong, or is anyone else experiencing the same thing?
  • Matej Hamas

    04/11/2023, 12:56 PM
    Hi, I’m Matej from Apify, the software engineer that wrote the Apify Airbyte connector (https://airbyte.com/connectors/apify). Recently, we tried to use it and we saw that it stopped working. Looking at the logs, I can’t see anything relevant, it seems like some internal Airbyte problem to me. Would somebody from the Airbyte team be available for assistance? Many thanks, Matej.
  • Lalit Kumar Nagwar

    04/11/2023, 1:48 PM
    Hello team, I am trying to deploy Airbyte on an AWS EC2 instance using the command "wget https://raw.githubusercontent.com/airbytehq/airbyte-platform/main/{.env,flags.yml,docker-compose.yaml}", but I am getting the error below:
    --2023-04-11 13:47:49--  https://raw.githubusercontent.com/airbytehq/airbyte-platform/main/.env
    Resolving raw.githubusercontent.com (raw.githubusercontent.com)... failed: Temporary failure in name resolution.
    wget: unable to resolve host address 'raw.githubusercontent.com'
    --2023-04-11 13:47:49--  https://raw.githubusercontent.com/airbytehq/airbyte-platform/main/flags.yml
    Resolving raw.githubusercontent.com (raw.githubusercontent.com)... failed: Temporary failure in name resolution.
    wget: unable to resolve host address 'raw.githubusercontent.com'
    --2023-04-11 13:47:49--  https://raw.githubusercontent.com/airbytehq/airbyte-platform/main/docker-compose.yaml
    Resolving raw.githubusercontent.com (raw.githubusercontent.com)... failed: Temporary failure in name resolution.
    wget: unable to resolve host address 'raw.githubusercontent.com'
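    The failure above is DNS resolution on the EC2 host rather than anything Airbyte-specific. A quick sketch of checks that may narrow it down (assumes a stock Amazon Linux or Ubuntu instance):
      cat /etc/resolv.conf                      # is a nameserver configured?
      getent hosts raw.githubusercontent.com    # does DNS resolution work at all?
      # if it fails, review the VPC/subnet DNS settings (enableDnsSupport, enableDnsHostnames)
      # and outbound internet access (internet/NAT gateway), then re-run:
      wget https://raw.githubusercontent.com/airbytehq/airbyte-platform/main/{.env,flags.yml,docker-compose.yaml}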
  • Thiago Villani

    04/11/2023, 2:53 PM
    Hello, I have a connection with an MSSQL source and a MinIO (S3) destination. When syncing to .parquet format I get the error shown below. I noticed the offending MSSQL columns are in bigint format, but if I run the sync to .json or .csv it goes through successfully. Do you have any tips for a solution?
    connector_command=write}],stacktrace=tech.allegro.schema.json2avro.converter.AvroConversionException: Failed to convert JSON to Avro: Could not evaluate union, field cgccpf is expected to be one of these: NULL, INT. If this is a complex type, check if offending field (path: cgccpf) adheres to schema: 10145286000157
        at tech.allegro.schema.json2avro.converter.JsonGenericRecordReader.read(JsonGenericRecordReader.java:131)
        at tech.allegro.schema.json2avro.converter.JsonGenericRecordReader.read(JsonGenericRecordReader.java:120)
        at tech.allegro.schema.json2avro.converter.JsonAvroConverter.convertToGenericDataRecord(JsonAvroConverter.java:95)
        at io.airbyte.integrations.destination.s3.avro.AvroRecordFactory.getAvroRecord(AvroRecordFactory.java:39)
        at io.airbyte.integrations.destination.s3.parquet.ParquetSerializedBuffer.accept(ParquetSerializedBuffer.java:97)
        at io.airbyte.integrations.destination.record_buffer.SerializedBufferingStrategy.addRecord(SerializedBufferingStrategy.java:90)
        at io.airbyte.integrations.destination.buffered_stream_consumer.BufferedStreamConsumer.acceptTracked(BufferedStreamConsumer.java:174)
  • Sandhya Manimaran

    04/11/2023, 3:28 PM
    Hi team, I need help finding out the callback URL for Airbyte.
  • Krzysztof Sikora

    04/11/2023, 4:31 PM
    There used to be an export/import feature in the UI. Is that fully gone, or is it still accessible somehow?
  • Raphael Pacheco

    04/11/2023, 4:57 PM
    Hello everyone, how is everything? I got to know Airbyte a little while ago, and I need to read parquet files inside a GCS folder. The problem is that this data is partitioned and we have several snappy.parquet files inside this folder. I tried using the GCS connector, but I was not able to connect successfully. I also used the Files connector, but it asks for a single URL to the file. I thought that using a wildcard character (*) to just match every .parquet file in the folder would work, but that didn't work either. Could you guide me on how I can make this connection? Thank you so much, guys!
  • Dominik Mall

    04/11/2023, 5:18 PM
    Hello! My sync logs often spew out hundreds or thousands of these messages:
    2023-04-11 17:13:49 DEBUG i.a.w.RecordSchemaValidator(validateSchema):75 - feature flag disabled for workspace 80d44133-1d1f-4709-9ea9-95248a689b77
    1. Does anyone know what this actually means and how to resolve it? (The connectors otherwise work fine.)
    2. Aside from raising the log level from DEBUG to something else, can this message be suppressed? (Using Airbyte OSS 0.41.0 on GKE)
  • Walker Philips

    04/11/2023, 5:21 PM
    Is there a way to control the order in which the connection settings from the spec.yaml/json file are displayed in the UI?
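    One option: the connector spec supports an order keyword on each property, and the UI generally sorts fields by it. A minimal spec.yaml sketch with example field names:
      connectionSpecification:
        $schema: http://json-schema.org/draft-07/schema#
        title: Example Source Spec
        type: object
        required:
          - api_key
        properties:
          api_key:
            type: string
            title: API Key
            airbyte_secret: true
            order: 0
          start_date:
            type: string
            title: Start Date
            order: 1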
  • Kevin Phan

    04/11/2023, 6:24 PM
    Hey all, we are getting these logs while testing an Airbyte connection locally, Postgres -> Snowflake. Any ideas? cc @Katie Aszklar
    2023-04-10 19:08:05 INFO i.a.w.t.TemporalAttemptExecution(get):105 - Docker volume job log path: /tmp/workspace/042ab8f0-40a1-43b3-a39c-7c97ea41c034/0/logs.log
    2023-04-10 19:08:05 INFO i.a.w.t.TemporalAttemptExecution(get):110 - Executing worker wrapper. Airbyte version: 0.40.0-alpha
    2023-04-10 19:08:05 INFO i.a.c.i.LineGobbler(voidCall):83 - Checking if airbyte/destination-postgres:0.3.21 exists...
    2023-04-10 19:08:05 INFO i.a.c.i.LineGobbler(voidCall):83 - airbyte/destination-postgres:0.3.21 not found locally. Attempting to pull the image...
    2023-04-10 19:09:32 INFO i.a.c.i.LineGobbler(voidCall):83 - Pulled airbyte/destination-postgres:0.3.21 from remote.
    2023-04-10 19:09:32 INFO i.a.w.p.DockerProcessFactory(create):108 - Creating docker job ID: 042ab8f0-40a1-43b3-a39c-7c97ea41c034
    2023-04-10 19:09:32 INFO i.a.w.p.DockerProcessFactory(create):163 - Preparing command: docker run --rm --init -i -w /data/042ab8f0-40a1-43b3-a39c-7c97ea41c034/0 --log-driver none --name destination-postgres-check-042ab8f0-40a1-43b3-a39c-7c97ea41c034-0-qqosw --network host -v airbyte_workspace:/data -v /tmp/airbyte_local:/local -e DEPLOYMENT_MODE=OSS -e USE_STREAM_CAPABLE_STATE=true -e AIRBYTE_ROLE= -e WORKER_ENVIRONMENT=DOCKER -e WORKER_CONNECTOR_IMAGE=airbyte/destination-postgres:0.3.21 -e WORKER_JOB_ATTEMPT=0 -e AIRBYTE_VERSION=0.40.0-alpha -e WORKER_JOB_ID=042ab8f0-40a1-43b3-a39c-7c97ea41c034 airbyte/destination-postgres:0.3.21 check --config source_config.json
    2023-04-10 19:09:33 ERROR i.a.c.i.LineGobbler(voidCall):83 - SLF4J: Class path contains multiple SLF4J bindings.
    2023-04-10 19:09:33 ERROR i.a.c.i.LineGobbler(voidCall):83 - SLF4J: Found binding in [jar:file:/airbyte/lib/log4j-slf4j-impl-2.17.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    2023-04-10 19:09:33 ERROR i.a.c.i.LineGobbler(voidCall):83 - SLF4J: Found binding in [jar:file:/airbyte/lib/slf4j-reload4j-1.7.36.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    2023-04-10 19:09:33 ERROR i.a.c.i.LineGobbler(voidCall):83 - SLF4J: Found binding in [jar:file:/airbyte/lib/logback-classic-1.2.3.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    2023-04-10 19:09:33 ERROR i.a.c.i.LineGobbler(voidCall):83 - SLF4J: Found binding in [jar:file:/airbyte/lib/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    2023-04-10 19:09:33 ERROR i.a.c.i.LineGobbler(voidCall):83 - SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
    2023-04-10 19:09:33 ERROR i.a.c.i.LineGobbler(voidCall):83 - SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
    2023-04-10 19:09:35 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2023-04-10 19:09:35 INFO i.a.i.d.p.PostgresDestination(main):90 - starting destination: class io.airbyte.integrations.destination.postgres.PostgresDestination
    2023-04-10 19:09:35 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2023-04-10 19:09:35 INFO i.a.i.b.IntegrationCliParser(parseOptions):118 - integration args: {check=null, config=source_config.json}
    2023-04-10 19:09:35 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2023-04-10 19:09:35 INFO i.a.i.b.IntegrationRunner(runInternal):123 - Running integration: io.airbyte.integrations.base.ssh.SshWrappedDestination
    2023-04-10 19:09:35 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2023-04-10 19:09:35 INFO i.a.i.b.IntegrationRunner(runInternal):124 - Command: CHECK
    2023-04-10 19:09:35 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2023-04-10 19:09:35 INFO i.a.i.b.IntegrationRunner(runInternal):125 - Integration config: IntegrationConfig{command=CHECK, configPath='source_config.json', catalogPath='null', statePath='null'}
    2023-04-10 19:09:35 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2023-04-10 19:09:35 WARN c.n.s.JsonMetaSchema(newValidator):338 - Unknown keyword order - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
    2023-04-10 19:09:35 INFO i.a.w.i.DefaultAirbyteStreamFactory(lambda$create$0):61 - 2023-04-10 19:09:35 WARN c.n.s.JsonMetaSchema(newValidator):338 - Unknown keyword examples - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
  • Thiago Villani

    04/11/2023, 7:25 PM
    Hello, I have a connection with an MSSQL Server source and a MinIO (S3) destination. I'm generating a .CSV, but the file gets a name based on the date. Can I keep a fixed file name, for example the same name as the source table?