# ask-community-for-troubleshooting
  • a

    Andrew Nessin R

    04/19/2023, 6:19 AM
    Hi team, I have a MySQL server that I would like to connect to as an Airbyte source. I don't have access to the MySQL server process. I am able to access it from MySQL Workbench without any issues, but Airbyte throws an error. To simulate this scenario, I ran a standalone MySQL server with this command:
    docker container run -d -p 3306:3306 -e MYSQL_ROOT_PASSWORD=password mysql:5.7.27 mysqld
    I then ran this Java program:
    package com.example;

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.SQLException;
    import java.sql.Statement;

    public class App {
        public static void main(String[] args) {
            // JDBC connection parameters
            String url = "jdbc:mysql://10.141.14.93:3306/TestDB"; // Replace with your MySQL database URL
            String username = "root"; // Replace with your MySQL username
            String password = "password"; // Replace with your MySQL password
            try {
                // Step 1: Register the JDBC driver
                Class.forName("com.mysql.cj.jdbc.Driver");
                // Step 2: Open a connection
                Connection connection = DriverManager.getConnection(url, username, password);
                // Step 3: Create a statement
                Statement statement = connection.createStatement();
                // Step 4: Execute a query
                String sql = "SELECT * FROM Friends"; // Replace with your SQL query
                ResultSet resultSet = statement.executeQuery(sql);
                // Step 5: Process the result
                while (resultSet.next()) {
                    int id = resultSet.getInt("id"); // Replace with your column names
                    String name = resultSet.getString("name");
                    int age = resultSet.getInt("age");
                    System.out.println("ID: " + id + ", Name: " + name + ", Age: " + age);
                }
                // Step 6: Close the resources
                resultSet.close();
                statement.close();
                connection.close();
            } catch (ClassNotFoundException | SQLException e) {
                e.printStackTrace();
            }
        }
    }
    Running this program gives me the following error, which is very useful for debugging the problem:
    Copy code
    Wed Apr 19 11:29:30 IST 2023 WARN: Establishing SSL connection without server's identity verification is not recommended. According to MySQL 5.5.45+, 5.6.26+ and 5.7.6+ requirements SSL connection must be established by default if explicit option isn't set. For compliance with existing applications not using SSL the verifyServerCertificate property is set to 'false'. You need either to explicitly disable SSL by setting useSSL=false, or set useSSL=true and provide truststore for server certificate verification.
    com.mysql.cj.jdbc.exceptions.CommunicationsException: Communications link failure
    
    The last packet sent successfully to the server was 0 milliseconds ago. The driver has not received any packets from the server.
            at com.mysql.cj.jdbc.exceptions.SQLError.createCommunicationsException(SQLError.java:590)
            at com.mysql.cj.jdbc.exceptions.SQLExceptionsMapping.translateException(SQLExceptionsMapping.java:57)
            at com.mysql.cj.jdbc.ConnectionImpl.createNewIO(ConnectionImpl.java:1606)
            at com.mysql.cj.jdbc.ConnectionImpl.<init>(ConnectionImpl.java:633)
            at com.mysql.cj.jdbc.ConnectionImpl.getInstance(ConnectionImpl.java:347)
            at com.mysql.cj.jdbc.NonRegisteringDriver.connect(NonRegisteringDriver.java:219)
            at java.sql/java.sql.DriverManager.getConnection(DriverManager.java:677)
            at java.sql/java.sql.DriverManager.getConnection(DriverManager.java:228)
            at com.example.App.main(App.java:22)
    Caused by: com.mysql.cj.core.exceptions.CJCommunicationsException: Communications link failure
    
    The last packet sent successfully to the server was 0 milliseconds ago. The driver has not received any packets from the server.
            at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
            at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
            at java.base/jdk.internal.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
            at java.base/java.lang.reflect.Constructor.newInstance(Constructor.java:490)
            at com.mysql.cj.core.exceptions.ExceptionFactory.createException(ExceptionFactory.java:54)
            at com.mysql.cj.core.exceptions.ExceptionFactory.createException(ExceptionFactory.java:93)
            at com.mysql.cj.core.exceptions.ExceptionFactory.createException(ExceptionFactory.java:133)
            at com.mysql.cj.core.exceptions.ExceptionFactory.createCommunicationsException(ExceptionFactory.java:149)
            at com.mysql.cj.mysqla.io.MysqlaProtocol.negotiateSSLConnection(MysqlaProtocol.java:309)
            at com.mysql.cj.mysqla.authentication.MysqlaAuthenticationProvider.negotiateSSLConnection(MysqlaAuthenticationProvider.java:769)
            at com.mysql.cj.mysqla.authentication.MysqlaAuthenticationProvider.proceedHandshakeWithPluggableAuthentication(MysqlaAuthenticationProvider.java:482)
            at com.mysql.cj.mysqla.authentication.MysqlaAuthenticationProvider.connect(MysqlaAuthenticationProvider.java:204)
            at com.mysql.cj.mysqla.io.MysqlaProtocol.connect(MysqlaProtocol.java:1414)
            at com.mysql.cj.mysqla.MysqlaSession.connect(MysqlaSession.java:132)
            at com.mysql.cj.jdbc.ConnectionImpl.connectOneTryOnly(ConnectionImpl.java:1726)
            at com.mysql.cj.jdbc.ConnectionImpl.createNewIO(ConnectionImpl.java:1596)
            ... 6 more
    Caused by: javax.net.ssl.SSLHandshakeException: No appropriate protocol (protocol is disabled or cipher suites are inappropriate)
            at java.base/sun.security.ssl.HandshakeContext.<init>(HandshakeContext.java:170)
            at java.base/sun.security.ssl.ClientHandshakeContext.<init>(ClientHandshakeContext.java:103)
            at java.base/sun.security.ssl.TransportContext.kickstart(TransportContext.java:222)
            at java.base/sun.security.ssl.SSLSocketImpl.startHandshake(SSLSocketImpl.java:449)
            at java.base/sun.security.ssl.SSLSocketImpl.startHandshake(SSLSocketImpl.java:427)
            at com.mysql.cj.core.io.ExportControlled.transformSocketToSSLSocket(ExportControlled.java:156)
    When I connect to the same MySQL server from Airbyte, I get the attached error. Here are the full logs:
    Copy code
    2023-04-19 06:00:37 INFO i.a.w.i.DefaultAirbyteStreamFactory(internalLog):168 - INFO i.a.i.b.IntegrationCliParser(parseOptions):126 integration args: {check=null, config=source_config.json}
    2023-04-19 06:00:37 INFO i.a.w.i.DefaultAirbyteStreamFactory(internalLog):168 - INFO i.a.i.b.IntegrationRunner(runInternal):105 Running integration: io.airbyte.integrations.base.ssh.SshWrappedSource
    2023-04-19 06:00:37 INFO i.a.w.i.DefaultAirbyteStreamFactory(internalLog):168 - INFO i.a.i.b.IntegrationRunner(runInternal):106 Command: CHECK
    2023-04-19 06:00:37 INFO i.a.w.i.DefaultAirbyteStreamFactory(internalLog):168 - INFO i.a.i.b.IntegrationRunner(runInternal):107 Integration config: IntegrationConfig{command=CHECK, configPath='source_config.json', catalogPath='null', statePath='null'}
    2023-04-19 06:00:38 WARN i.a.w.i.DefaultAirbyteStreamFactory(internalLog):165 - WARN c.n.s.JsonMetaSchema(newValidator):278 Unknown keyword order - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
    2023-04-19 06:00:38 WARN i.a.w.i.DefaultAirbyteStreamFactory(internalLog):165 - WARN c.n.s.JsonMetaSchema(newValidator):278 Unknown keyword airbyte_secret - you should define your own Meta Schema. If the keyword is irrelevant for validation, just use a NonValidationKeyword
    2023-04-19 06:00:38 INFO i.a.w.i.DefaultAirbyteStreamFactory(internalLog):168 - INFO i.a.i.b.s.SshTunnel(getInstance):204 Starting connection with method: NO_TUNNEL
    2023-04-19 06:00:38 INFO i.a.w.i.DefaultAirbyteStreamFactory(internalLog):168 - INFO c.z.h.HikariDataSource(<init>):80 HikariPool-1 - Starting...
    2023-04-19 06:00:38 INFO i.a.w.i.DefaultAirbyteStreamFactory(internalLog):168 - INFO c.z.h.HikariDataSource(<init>):82 HikariPool-1 - Start completed.
    2023-04-19 06:01:38 INFO i.a.w.i.DefaultAirbyteStreamFactory(internalLog):168 - INFO c.z.h.HikariDataSource(close):350 HikariPool-1 - Shutdown initiated...
    2023-04-19 06:01:39 INFO i.a.w.i.DefaultAirbyteStreamFactory(internalLog):168 - INFO c.z.h.HikariDataSource(close):352 HikariPool-1 - Shutdown completed.
    2023-04-19 06:01:39 INFO i.a.w.i.DefaultAirbyteStreamFactory(internalLog):168 - INFO i.a.i.b.IntegrationRunner(runInternal):182 Completed integration: io.airbyte.integrations.base.ssh.SshWrappedSource
    2023-04-19 06:01:39 INFO i.a.w.i.DefaultAirbyteStreamFactory(internalLog):168 - INFO i.a.i.s.m.MySqlSource(main):311 completed source: class io.airbyte.integrations.source.mysql.MySqlSource
    2023-04-19 06:01:40 INFO i.a.w.g.DefaultCheckConnectionWorker(run):120 - Check connection job received output: io.airbyte.config.StandardCheckConnectionOutput@679a75b1[status=failed,message=State code: 08S01; Message: Communications link failure
    
    The last packet sent successfully to the server was 0 milliseconds ago. The driver has not received any packets from the server.]
    2023-04-19 06:01:40 INFO i.a.w.t.TemporalAttemptExecution(get):169 - Stopping cancellation check scheduling...
    2023-04-19 06:01:40 INFO i.a.c.i.LineGobbler(voidCall):149 - 
    2023-04-19 06:01:40 INFO i.a.c.i.LineGobbler(voidCall):149 - ----- END CHECK -----
    2023-04-19 06:01:40 INFO i.a.c.i.LineGobbler(voidCall):149 -
    The original Java error has been turned into a single line, which significantly reduces its usefulness:
    Copy code
    2023-04-19 06:01:40 INFO i.a.w.g.DefaultCheckConnectionWorker(run):120 - Check connection job received output: io.airbyte.config.StandardCheckConnectionOutput@679a75b1[status=failed,message=State code: 08S01; Message: Communications link failure
    
    The last packet sent successfully to the server was 0 milliseconds ago. The driver has not received any packets from the server.]
    If I try to view the logs of the temporary connector container while the connection is being tested by Airbyte, I get this:
    Copy code
    ubuntu@air1:~$ docker container logs -f source-mysql-check-f0e7688e-0d95-441f-bbee-78fc45b84e82-0-dlscp 
    Error response from daemon: configured logging driver does not support reading
    The error message from the Java program suggests using
    useSSL=false
    in the JDBC URL. If I do that in Airbyte, the connection succeeds; I have attached a screenshot of this as well. Now the question is: how do I access the full Java exception that occurred while connecting from Airbyte? The example above shows Airbyte swallowing useful error logs. In my case, I can't access the MySQL server, and Airbyte is probably swallowing useful logs, just like in the example above. I could run this same Java program against the MySQL server that won't connect and see if it produces useful error messages. However, I would like to know whether Airbyte is swallowing error logs, or whether there is a different place I can look to get the full error logs. The logs I have shared are from the UI. I see the same logs when I run
    docker container logs airbyte-worker
    .
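    For reference, a minimal sketch of the workaround described above, assuming the same host, credentials, and MySQL Connector/J driver as the test program; the only change is the extra JDBC URL parameter (in Airbyte, the same string goes into the MySQL source's "JDBC URL Params" field):
    // Hypothetical variant of the test program above; same placeholder host and credentials.
    // useSSL=false skips TLS entirely; enabledTLSProtocols=TLSv1.2 is an alternative if the
    // server supports it. This avoids the "No appropriate protocol" handshake failure.
    import java.sql.Connection;
    import java.sql.DriverManager;

    public class SslWorkaroundCheck {
        public static void main(String[] args) throws Exception {
            String url = "jdbc:mysql://10.141.14.93:3306/TestDB?useSSL=false";
            try (Connection connection = DriverManager.getConnection(url, "root", "password")) {
                // If this prints, the communications-link failure was caused by the TLS handshake.
                System.out.println("Connected to " + connection.getMetaData().getDatabaseProductVersion());
            }
        }
    }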
  • y

    Yogic Wahyu

    04/19/2023, 6:43 AM
    Hi team, I want to ask about Airbyte’s notification webhook: can we use it to detect/report any reset jobs? Version used: 0.40.28
    k
    • 2
    • 4
  • a

    Ananya Singh

    04/19/2023, 8:36 AM
    Hello, does Airbyte have the capability to automatically generate and emit a catalog that describes the available streams and schemas in the source?
    k
    • 2
    • 2
  • t

    Tristan Crudge

    04/19/2023, 10:16 AM
    Hey, I have a question regarding method input params, specifically
    cursor_field
    . More info in thread.
    k
    • 2
    • 3
  • t

    Tahar Ben Achour

    04/19/2023, 10:16 AM
    Hello, I deployed Airbyte on Kubernetes with Helm. I am using version 0.44.0. All pods are running except the webapp, which crashes; the logs show the following error:
    kubectl -n airbyte logs airbyte-webapp-68b8d6d5c9-m6xgr
    Copy code
    /docker-entrypoint.sh: /docker-entrypoint.d/ is not empty, will attempt to perform configuration
    /docker-entrypoint.sh: Looking for shell scripts in /docker-entrypoint.d/
    /docker-entrypoint.sh: Launching /docker-entrypoint.d/10-listen-on-ipv6-by-default.sh
    10-listen-on-ipv6-by-default.sh: info: Getting the checksum of /etc/nginx/conf.d/default.conf
    10-listen-on-ipv6-by-default.sh: info: Enabled listen on IPv6 in /etc/nginx/conf.d/default.conf
    /docker-entrypoint.sh: Launching /docker-entrypoint.d/20-envsubst-on-templates.sh
    20-envsubst-on-templates.sh: Running envsubst on /etc/nginx/templates/default.conf.template to /etc/nginx/conf.d/default.conf
    /docker-entrypoint.sh: Launching /docker-entrypoint.d/30-tune-worker-processes.sh
    /docker-entrypoint.sh: Configuration complete; ready for start up
    2023/04/19 10:07:32 [emerg] 1#1: unknown "airbyte_version" variable
    nginx: [emerg] unknown "airbyte_version" variable
    Any idea? Thank you.
    k
    • 2
    • 2
  • r

    Rui Santos

    04/19/2023, 10:21 AM
    Hey guys, a not-so-technical question, but I couldn’t find this anywhere: what’s the difference between the airbyte and the airbyte-platform repos, for someone who’s trying to deploy Airbyte on an EC2 instance? 🤔 Thanks
    ✅ 1
    k
    m
    • 3
    • 4
  • m

    Moaz ElNashar

    04/19/2023, 10:50 AM
    Hello folks, I have an issue with my Airbyte setup on Kubernetes. Deployment: Helm chart. Airbyte version: 0.40.32. The problem is that the worker and server pods consume a lot of memory (server > 5 GB and worker > 3 GB), which grows over time until the machine gets overloaded and we have to restart the pods.
    m
    i
    • 3
    • 4
  • i

    ishan

    04/19/2023, 10:57 AM
    How can I convert the 'BASIC_AUTH_PASSWORD' env variable to a docker secret?
    k
    m
    • 3
    • 3
  • n

    Nina Jensen

    04/19/2023, 1:18 PM
    Is it not possible to update connection schemas via the configuration API? I am trying to recreate the schema discover workflow with API calls, and while I can discover schema changes just fine using
    /web_backend/connections/get
    updating with
    /web_backend/connections/update
    has no effect, even though that is seemingly what happens internally, judging by the logs. What am I missing here?
    k
    • 2
    • 3
  • j

    James (he/him)

    04/19/2023, 1:32 PM
    Hi everyone. I’ve submitted PRs to make some small improvements to the Google Ads connector: https://github.com/airbytehq/airbyte/pull/25314 https://github.com/airbytehq/airbyte/pull/25284 Not sure where the best place to get a review is, so I’m posting here 🙂
    k
    m
    • 3
    • 4
  • d

    Dale Gilliam

    04/19/2023, 2:02 PM
    Hello, I’m wondering if anyone else has seen this and knows whether there’s a configuration issue here or if it’s a bug. Connector: Paypal Transaction. I’m pulling balances, and the payload should have an object
    balances
    , but that’s missing. It has everything else, which is basically nothing. Here’s an example of what is emitted in the raw JSON:
    Copy code
    {
      "account_id": "XXXXXXXXX",
      "last_refresh_time": "2023-04-18T15:29:59Z",
      "as_of_time": "2023-04-16T00:00:00+00:00"
    }
    k
    • 2
    • 5
  • r

    Robert Put

    04/19/2023, 2:03 PM
    When using Google Sheets as a source, it asks for permission to all of my Google Sheets content... 1. Is there some way to limit this to one specific sheet? 2. Is it even worth using this source if it maps all the data types to strings? https://docs.airbyte.com/integrations/sources/google-sheets I'm on Airbyte Cloud.
    k
    • 2
    • 3
  • r

    Rafael Rossini

    04/19/2023, 2:41 PM
    My Airbyte connection is synced daily using a cron expression, but if I leave my EC2 instance with Airbyte hosted online, it keeps syncing even though the cron is set to sync once per day.
    k
    m
    • 3
    • 5
  • l

    LeftKlick

    04/19/2023, 4:25 PM
    I have been running into this issue syncing a fairly large table from postgresql to clickhouse. The solution seems to be to increase the
    send_receive_timeout
    (it's already been bumped up on ClickHouse). Is there a way I can test this out while I wait for the permanent fix to be pushed into a release?
    k
    • 2
    • 3
  • l

    Luis Peña

    04/19/2023, 4:52 PM
    Hello, I was wondering if this is the right channel to ask this: are there any plans for the Microsoft SQL Server (MSSQL) connector to add a feature that uses Change Tracking for data updates? Reference: https://learn.microsoft.com/en-us/sql/relational-databases/track-changes/about-change-tracking-sql-server?view=sql-server-ver16
    k
    s
    +4
    • 7
    • 14
  • j

    Jason Gluck

    04/19/2023, 6:03 PM
    How would airbyte handle a cursor field, say
    updated_at
    , with null values in a source table for an incremental sync? Does it pick up the null values on an initial sync, or ignore them? Asking to determine if the
    updated_at
    field needs to be backfilled in the source, or if it would be okay to leave historical records null.
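    Not an answer to the behavior question, but a hedged sketch of how one might quantify the problem before deciding on a backfill: count how many rows have a NULL cursor value (connection URL, table, and column names below are placeholders).
    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.Statement;

    public class NullCursorCount {
        public static void main(String[] args) throws Exception {
            // Placeholder URL/credentials; point this at the actual source database.
            String url = "jdbc:mysql://source-host:3306/SourceDB?useSSL=false";
            try (Connection conn = DriverManager.getConnection(url, "user", "password");
                 Statement stmt = conn.createStatement();
                 // Rows with a NULL cursor are the ones an incremental sync could skip.
                 ResultSet rs = stmt.executeQuery(
                     "SELECT COUNT(*) FROM my_table WHERE updated_at IS NULL")) {
                rs.next();
                System.out.println("Rows with NULL updated_at: " + rs.getLong(1));
            }
        }
    }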
    k
    • 2
    • 2
  • s

    Sachin Patidar

    04/19/2023, 7:32 PM
    I am trying to set up a connection with Salesforce as the source and Iceberg as the destination… The connection's sync process completes successfully with recordsCommitted, but when I try to query the Iceberg table I get no records, and only the three Airbyte-related columns. May I know if anyone has tried setting Iceberg up as a destination and gotten it working?
    k
    • 2
    • 2
  • t

    Tobias Macey

    04/19/2023, 7:40 PM
    Does anyone have a sense of whether it would be possible to add some lightweight transformation in a destination connector? In particular I'm thinking of explicitly casting nested JSON documents that don't have a
    properties
    definition into properly escaped strings so that they can be processed by e.g. Trino within the s3-glue connector. It builds on top of the
    destination-s3
    connector and writes the records in JSONL format.
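    A rough sketch of the kind of transformation described above, using Jackson purely for illustration (not the connector's actual code): nested objects or arrays are re-serialized as escaped JSON strings before the record is written out, so an engine like Trino sees a plain string column.
    import com.fasterxml.jackson.databind.JsonNode;
    import com.fasterxml.jackson.databind.ObjectMapper;
    import com.fasterxml.jackson.databind.node.ObjectNode;

    public class StringifyNestedJson {
        private static final ObjectMapper MAPPER = new ObjectMapper();

        // Replace every nested object/array value with its serialized string form.
        static ObjectNode stringifyNestedFields(ObjectNode record) throws Exception {
            ObjectNode out = record.deepCopy();
            var fields = record.fields();
            while (fields.hasNext()) {
                var entry = fields.next();
                JsonNode value = entry.getValue();
                if (value.isObject() || value.isArray()) {
                    out.put(entry.getKey(), MAPPER.writeValueAsString(value));
                }
            }
            return out;
        }

        public static void main(String[] args) throws Exception {
            ObjectNode record = (ObjectNode) MAPPER.readTree(
                "{\"id\": 1, \"payload\": {\"a\": [1, 2], \"b\": {\"c\": true}}}");
            System.out.println(MAPPER.writeValueAsString(stringifyNestedFields(record)));
            // -> {"id":1,"payload":"{\"a\":[1,2],\"b\":{\"c\":true}}"}
        }
    }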
    k
    • 2
    • 2
  • j

    Jan Pavel

    04/19/2023, 7:50 PM
    I have a connection between MS SQL Server and BigQuery. I have set up CDC on MS SQL Server. I am using Incremental Sync - Deduped History. SCD historization is not working correctly because CDC on MS SQL Server uses local time zone instead of UTC. In BigQuery the difference is 2 hours. How can I set the time zone change to UTC in Airbyte? Thank you.
  • e

    Ethan Veres

    04/19/2023, 8:53 PM
    What’s the current status of deploying Airbyte to ECS Fargate? All of our apps run on ECS Fargate and we don’t have the Kubernetes expertise to get this working.
    k
    m
    • 3
    • 4
  • m

    Mauricio Alarcon

    04/19/2023, 9:23 PM
    Hello, quick question - has anybody been able to use Snowflake as a source? I know this connector is in alpha; I’ve tried several versions, and the latest two failed for something that doesn’t make sense: it complains that there’s no active warehouse for the source, even though one is selected and also passed in the JDBC params.
    Copy code
    Stack Trace: java.lang.RuntimeException: java.lang.RuntimeException: net.snowflake.client.jdbc.SnowflakeSQLException: No active warehouse selected in the current session.  Select an active warehouse with the 'use warehouse' command.
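    A hedged sketch of one way to check the warehouse outside Airbyte with the standard Snowflake JDBC driver (account, user, warehouse, and role names are placeholders): if this also fails with the same message, the issue is likely grants or the user's default warehouse rather than the connector.
    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.Statement;
    import java.util.Properties;

    public class SnowflakeWarehouseCheck {
        public static void main(String[] args) throws Exception {
            Properties props = new Properties();
            props.put("user", "AIRBYTE_USER");      // placeholder
            props.put("password", "***");           // placeholder
            props.put("warehouse", "AIRBYTE_WH");   // the warehouse the source config points at
            props.put("db", "MY_DB");
            props.put("role", "AIRBYTE_ROLE");
            try (Connection conn = DriverManager.getConnection(
                     "jdbc:snowflake://myaccount.snowflakecomputing.com/", props);
                 Statement stmt = conn.createStatement();
                 ResultSet rs = stmt.executeQuery("SELECT CURRENT_WAREHOUSE()")) {
                rs.next();
                System.out.println("Active warehouse: " + rs.getString(1));
            }
        }
    }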
    plus1 1
    k
    k
    h
    • 4
    • 25
  • d

    Dan Cook

    04/19/2023, 9:41 PM
    Environment
    • Airbyte version: Airbyte Cloud
    • Source Connector and version: Salesforce 2.0.9, as of today
    • Destination Connector and version: Snowflake 0.4.61, as of today
    Current Behavior
    Some columns in our Salesforce objects are formula fields, in other words computed from other SF fields (sometimes in other SF objects). Incremental Salesforce sync is typically based on
    SystemModstamp
    , but that value doesn't change when a formula field is updated based on some other object. Therefore a row won't get synced until it undergoes a change to a 'native' column. This is a problem for us, and to get around it I have a theory that I can schedule a once-daily [Full Refresh | Overwrite] sync and then one or more [Incremental | Append] syncs to the same table later in the same day. That way all columns of all rows, incl. formula fields, will get updated at least once a day. Does this sound feasible?
    k
    b
    • 3
    • 9
  • m

    Micky

    04/19/2023, 9:55 PM
    Hi everyone, I set the sync mode to incremental deduped history, but I found that some data we deleted in the source is still in the destination. Does this mode detect data deletions? Or any suggestions on sync mode? Thanks.
    k
    l
    a
    • 4
    • 6
  • m

    Marcus Vicentini

    04/20/2023, 1:18 AM
    Hi guys, I'm new to Airbyte and I am running some tests with the Salesforce and Google Sheets connectors. When I trigger the connection it takes a long time to extract all the data. The Salesforce connector takes about 40 minutes to extract all data, and the tables are not too big, about 50k rows. The CSV connector takes 15 minutes to extract just one file with 100k rows. Is there a way to speed up these processes? I'm using Airbyte Core on Docker Compose.
    k
    • 2
    • 2
  • r

    RJ

    04/20/2023, 2:02 AM
    - Is this your first time deploying Airbyte?: Yes
    - OS Version / Instance: My development machine is macOS, Airbyte is running in Docker on Amazon Linux
    - Memory / Disk: 8 GiB / 30 GiB
    - Deployment: Docker
    - Airbyte Version: 0.44.1
    - Source name/version: Convex 0.1.0
    - Destination name/version: ClickHouse 0.2.3
    - Step: Creating the connection.
    - Description: I have a fresh instance of Airbyte running in an EC2 instance, and am trying to create a connection between Convex (the source) and ClickHouse (the destination), which is hosted by ClickHouse Cloud. The Convex connection worked just fine, but the ClickHouse connection is failing with the following error:
    #### Configuration check failed
    Could not connect with provided configuration. java.sql.SQLTransientConnectionException: HikariPool-1 - Connection is not available, request timed out after 60001ms.
    I found this issue, which suggested that perhaps I ought to have added
    socket_timeout=300000
    to the “JDBC URL Params” field. I did this, and observed no change in behavior. I also found this issue, which suggests that the
    socket_timeout
    JDBC URL parameter is not currently being respected.
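    In case it helps isolate whether the timeout is on the Airbyte side or the network side, a hedged sketch of a direct JDBC test against ClickHouse Cloud (hostname and credentials are placeholders; ClickHouse Cloud typically expects SSL on port 8443, and socket_timeout is the same parameter mentioned above).
    import java.sql.Connection;
    import java.sql.DriverManager;

    public class ClickHouseConnectCheck {
        public static void main(String[] args) throws Exception {
            // Placeholder host; ssl=true on port 8443 is what ClickHouse Cloud usually requires,
            // and socket_timeout mirrors the JDBC URL param tried in the Airbyte destination config.
            String url = "jdbc:clickhouse://myinstance.clickhouse.cloud:8443/default"
                    + "?ssl=true&socket_timeout=300000";
            try (Connection conn = DriverManager.getConnection(url, "default", "password")) {
                System.out.println("Connected: " + conn.getMetaData().getDatabaseProductVersion());
            }
        }
    }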
    k
    r
    j
    • 4
    • 7
  • s

    Slackbot

    04/20/2023, 2:21 AM
    This message was deleted.
    k
    • 2
    • 2
  • s

    Slackbot

    04/20/2023, 2:22 AM
    This message was deleted.
    k
    • 2
    • 2
  • t

    Tony Peng

    04/20/2023, 3:07 AM
    Hi, I am trying to set up Google BigQuery as the destination. In the field “Service Account Key JSON (Required for cloud, optional for open-source)“, I input:
    /Users/tony3/Downloads/name-of-the-service-account-key.json
    But I got an error message:
    Configuration check failed
    com.google.gson.stream.MalformedJsonException: Use JsonReader.setLenient(true) to accept malformed JSON at line 1 column 2 path $
    My service account key has the following format:
    {
      "type": "service_account",
      "project_id": "made-with-ml-384201",
      "private_key_id": "KEY_ID",
      "private_key": "-----BEGIN PRIVATE KEY-----\nPRIVATE_KEY\n-----END PRIVATE KEY-----\n",
      "client_email": "SERVICE_ACCOUNT_EMAIL",
      "client_id": "CLIENT_ID",
      "auth_uri": "https://accounts.google.com/o/oauth2/auth",
      "token_uri": "https://oauth2.googleapis.com/token",
      "auth_provider_x509_cert_url": "https://www.googleapis.com/oauth2/v1/certs",
      "client_x509_cert_url": "https://www.googleapis.com/robot/v1/metadata/x509/my-made-with-ml%40made-with-ml-384201.iam.gserviceaccount.com"
    }
    , which seems OK. I appreciate your help! Tony
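    The "MalformedJsonException ... at line 1 column 2" error usually means the field received the literal file path rather than the JSON contents. A hedged sketch (using Gson, since that is where the error comes from) that reads the key file and prints it after confirming it parses, so the contents themselves can be pasted into the "Service Account Key JSON" field:
    import com.google.gson.JsonObject;
    import com.google.gson.JsonParser;
    import java.nio.file.Files;
    import java.nio.file.Path;

    public class ServiceAccountKeyCheck {
        public static void main(String[] args) throws Exception {
            // Path from the message above; only the file's *contents* go into the Airbyte field.
            String json = Files.readString(
                Path.of("/Users/tony3/Downloads/name-of-the-service-account-key.json"));
            JsonObject key = JsonParser.parseString(json).getAsJsonObject();
            System.out.println("Parsed key for project: " + key.get("project_id").getAsString());
            System.out.println(json); // paste this whole JSON blob into the field
        }
    }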
    k
    • 2
    • 3
  • r

    Rajesh Koilpillai

    04/20/2023, 3:43 AM
    Can we use the Airbyte Configuration API https://raw.githubusercontent.com/airbytehq/airbyte/master/airbyte-api/src/main/openapi/config.yaml to fetch the list of existing connections in an Airbyte OSS instance?
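    A hedged sketch of what that might look like from Java, assuming a local OSS instance on port 8000 and the connections/list operation from that config.yaml (which, as far as I can tell, takes a workspaceId):
    import java.net.URI;
    import java.net.http.HttpClient;
    import java.net.http.HttpRequest;
    import java.net.http.HttpResponse;

    public class ListConnections {
        public static void main(String[] args) throws Exception {
            // Placeholder workspace id and host; a basic-auth header may also be needed
            // depending on how the instance is exposed.
            String body = "{\"workspaceId\": \"00000000-0000-0000-0000-000000000000\"}";
            HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("http://localhost:8000/api/v1/connections/list"))
                .header("Content-Type", "application/json")
                .POST(HttpRequest.BodyPublishers.ofString(body))
                .build();
            HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());
            System.out.println(response.body()); // JSON with the list of connections
        }
    }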
    k
    • 2
    • 8
  • r

    Rajesh Koilpillai

    04/20/2023, 5:40 AM
    The Airbyte source-rss connector takes url as a property, as per https://github.com/airbytehq/airbyte/blob/master/airbyte-integrations/connectors/source-rss/source_rss/spec.yaml. Would it be possible to pass the connector configuration specified in the connector specification (per spec.yaml) while triggering a job using the Airbyte API?
    k
    • 2
    • 2