# ask-community-for-troubleshooting
  • Octavia Squidington III (07/10/2023, 7:45 PM)
    🔥 Community Office Hours starts in 15 minutes 🔥 Topic and schedule posted in #C045VK5AF54. At 1pm PDT, click here to join us on Zoom!
  • Mark McKelvey (07/10/2023, 10:12 PM)
    Is Airbyte down? I'm unable to access any active workspaces.
  • P W (07/10/2023, 11:42 PM)
    Airbyte 0.50.7, Postgres source to a MySQL 10.6 destination, error: Database Error in model customers (models/generated/airbyte_tables/jaffle_shop/customers.sql): 1064 (42000): You have an error in your SQL syntax; check the manual that corresponds to your MariaDB server version for the right syntax to use near 'RETURNING CHAR) as last_name, json_value(_airbyte_data, '$."id"' RET...' at line 14. Compiled SQL at ..
  • Chidambara Ganapathy (07/11/2023, 12:43 AM)
    Hi, we have used Airbyte APIs in our application to connect to data sources. The issue is that Airbyte's source create API is called successfully even when I don't pass the required inputs (e.g. Stripe, Aha). How should we handle this?
  • Nazif Ishrak (07/11/2023, 5:44 AM)
    I have a question about how to load bulk data from a massive table without a lot of resources. I am using Airbyte and my database is in GCP. I sometimes need to do an incremental sync based on both the updated_at and created_at cursor fields, but Airbyte doesn't support multiple cursor fields. What can I do?
  • Ekansh Verma (07/11/2023, 7:34 AM)
    Hi team! While deploying Airbyte on EKS using Helm charts and trying to push logs to an S3 bucket, when I try to sync a connection I get a toast stating "Failed to start sync: Internal Server Error: The specified bucket does not exist (Service: S3, Status Code: 404, Request ID: , Extended Request ID: )". The sync completes but the logs are missing.
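    A quick way to reproduce the 404 outside Airbyte is to ask S3 for the bucket using the same credentials and region; a minimal sketch (bucket name and region are placeholders):
    ```python
    # Minimal sketch: check whether the configured log bucket is visible to the
    # same credentials/region Airbyte uses. Bucket name and region are placeholders.
    import boto3
    from botocore.exceptions import ClientError

    s3 = boto3.client("s3", region_name="us-east-1")  # placeholder region
    try:
        s3.head_bucket(Bucket="my-airbyte-logs-bucket")  # placeholder bucket name
        print("bucket exists and is reachable")
    except ClientError as err:
        # A 404 here matches the "specified bucket does not exist" toast in the UI.
        print("head_bucket failed:", err.response["Error"])
    ```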
  • Nisha Biradar (07/11/2023, 8:05 AM)
    Is there a way to integrate Spark-based transformations with Airbyte syncs?
  • Ekansh Verma (07/11/2023, 9:27 AM)
    Hi team! While deploying Airbyte on EKS using Helm charts and trying to push logs to an S3 bucket, when I try to sync a connection I get a toast stating "The AWS Access Key Id you provided does not exist in our records. (Service: S3, Status Code: 403, Request ID: , ". The sync completes but the logs are missing.
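    A quick way to check the 403 outside Airbyte is to ask STS which identity the configured access key actually belongs to; a minimal sketch, assuming the same credentials given to the chart are available locally:
    ```python
    # Minimal sketch: confirm which AWS identity the access key belongs to.
    # "Access Key Id ... does not exist" usually means the key is wrong, inactive,
    # or not the one the pods ended up with.
    import boto3

    sts = boto3.client("sts")          # uses the same credentials you gave the chart
    print(sts.get_caller_identity())   # shows Account and Arn for the key in use
    ```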
  • Rutger Weemhoff (07/11/2023, 9:40 AM)
    Hello all, I am running version 0.50.7 and having trouble setting up webhook notifications, the same as @Qamarudeen Muhammad experienced a couple of months ago (https://airbytehq.slack.com/archives/C021JANJ6TY/p1680865391482929). It does not matter if I test a Slack notification hook or any other (like ifttt.com). In all cases I receive this error:
    ```
    Webhook test failed. Please verify that the webhook URL is valid.
    ```
    On my previous version (0.43.2) I had exactly the same issue, so it does not seem to be related to my recent version upgrade. I am running Airbyte on Docker. The webhook URLs work correctly if I test them with `curl` from the virtual machine where Docker runs, or even from the Docker container.
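    For reference, the manual webhook test described above can also be reproduced with a short script; a minimal sketch where the webhook URL is a placeholder:
    ```python
    # Minimal sketch of the manual webhook test mentioned above; the URL is a
    # placeholder for a real Slack (or ifttt.com) webhook.
    import requests

    url = "https://hooks.slack.com/services/XXX/YYY/ZZZ"  # placeholder webhook URL
    resp = requests.post(url, json={"text": "Airbyte webhook connectivity test"}, timeout=10)
    print(resp.status_code, resp.text)  # Slack answers 200 "ok" when the hook is reachable
    ```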
  • Moti Zamir (07/11/2023, 11:21 AM)
    Hi there! Can anyone help me connect Facebook Marketing? I get a facebookapierror when I try.
  • Dipankar Kumar (07/11/2023, 11:35 AM)
    Hi Team, I have recently upgraded Airbyte to the latest 44.4 version and it has become very slow, taking 3-4 minutes for the home page to appear after logging in.
  • Nadav Amami (07/11/2023, 11:48 AM)
    Hi, the docs say that OAuth connectors like google-ads are created in a few steps (sketched below):
    1. Get the authorization URL by calling "Initiate OAuth for a Source", passing a redirect URI.
    2. Implement a callback in my product that receives a secret_id parameter after the user has authenticated with their Google account.
    3. Use the secret_id to create the google-ads source without actual tokens: "That secret ID can be used to create a source with credentials in place of actual tokens."
    But looking at the create source API, I don't see how this can be set up. I'd appreciate some help.
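    A minimal sketch of the three steps above, assuming the "Initiate OAuth for a Source" and "Create a Source" endpoints of the Airbyte API; the base URL, request field names (workspaceId, redirectUrl, sourceType, secretId) and the callback URL are assumptions used for illustration, not a confirmed API contract.
    ```python
    # Hypothetical sketch of the OAuth flow described above; endpoint paths and
    # field names are assumptions, not confirmed Airbyte API signatures.
    import requests

    API = "https://api.airbyte.com/v1"               # assumed base URL
    HEADERS = {"Authorization": "Bearer <token>"}    # placeholder credentials

    # 1. Ask Airbyte for the consent/authorization URL (assumed endpoint/body).
    init = requests.post(
        f"{API}/sources/initiateOAuth",
        headers=HEADERS,
        json={
            "workspaceId": "<workspace-id>",
            "sourceType": "google-ads",
            "redirectUrl": "https://my-app.example.com/oauth/callback",  # hypothetical callback
        },
    )
    print(init.status_code, init.json())

    # 2. After the user authenticates, the callback receives a secret_id parameter.
    secret_id = "<secret_id received by the callback>"

    # 3. Create the source, passing the secret ID instead of raw tokens (assumed field name).
    create = requests.post(
        f"{API}/sources",
        headers=HEADERS,
        json={
            "name": "Google Ads via OAuth",
            "workspaceId": "<workspace-id>",
            "secretId": secret_id,
            "configuration": {"sourceType": "google-ads"},
        },
    )
    print(create.status_code, create.json())
    ```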
  • Stan Gorch (07/11/2023, 12:14 PM)
    Hi, can you please explain one thing connected with sync? A bit of context: I've launched Airbyte locally in Docker and set up a MySQL source and a BigQuery destination. I've also created a connection between them with incremental sync. My source's tables are fairly huge (about 20 million records). The sync is going well, but I can't see any data in BigQuery. In the job's logs I can only see that some number of rows were loaded, but where? Is there any intermediate storage or something?
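    Context that may help with the question above: for this generation of the BigQuery destination, records typically land first in raw tables named like `_airbyte_raw_<stream>` in the target dataset, and normalization builds the final tables afterwards. A minimal sketch for peeking at a raw table (project, dataset and stream names are placeholders):
    ```python
    # Minimal sketch: inspect Airbyte's intermediate (raw) table in BigQuery.
    # Project, dataset and table names below are placeholders.
    from google.cloud import bigquery

    client = bigquery.Client()
    query = """
        SELECT _airbyte_ab_id, _airbyte_emitted_at, _airbyte_data
        FROM `my_project.my_dataset._airbyte_raw_my_table`
        LIMIT 10
    """
    for row in client.query(query).result():
        print(dict(row))
    ```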
  • lisandro maselli (07/11/2023, 12:57 PM)
    Hi, I have a MySQL slave that is attached to Airbyte CDC. For some databases it takes more than 8 hours to complete the extraction of 30 MB. Checking the logs, it buffers bytes of data in every cycle. What are the possible problems?
  • Octavia Squidington III (07/11/2023, 1:45 PM)
    🔥 Office Hours starts in 15 minutes 🔥 Topic and schedule posted in #C045VK5AF54. At 16:00 CEST / 10am EDT, click here to join us on Zoom!
  • Richard Anthony Hein (Auxon) (07/11/2023, 2:41 PM)
    I'm trying to create an MSSQL to JSON connection but no streams show up when configuring the destination connection; what am I missing?
  • Šimon Appelt (07/11/2023, 3:17 PM)
    Hey, we just upgraded Airbyte from 0.40.x to 0.50.7 and the CDC connection from Postgres to BigQuery where we used Incremental + Deduped mode stopped working. In fact, Airbyte only creates `_airbyte_raw` tables and nothing else. Is this expected behaviour for the new version?
  • Joey Hernandez (07/11/2023, 3:23 PM)
    Hi All, I am trying to set up a Redshift source to Redshift destination connection, but I keep getting this error: "Discovering schema failed. Something went wrong in the connector. See the logs for more details."
    ```
    at com.google.common.collect.ImmutableMap$Builder.put(ImmutableMap.java:449)
    at io.airbyte.integrations.source.jdbc.AbstractJdbcSource.getColumnMetadata(AbstractJdbcSource.java:268)
    at io.airbyte.db.jdbc.JdbcDatabase$1.tryAdvance(JdbcDatabase.java:81)
    at java.base/java.util.Spliterator.forEachRemaining(Spliterator.java:332)
    at java.base/java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:509)
    at java.base/java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:499)
    at java.base/java.util.stream.ReduceOps$ReduceOp.evaluateSequential(ReduceOps.java:921)
    at java.base/java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234)
    at java.base/java.util.stream.ReferencePipeline.collect(ReferencePipeline.java:682)
    at io.airbyte.db.jdbc.DefaultJdbcDatabase.bufferedResultSetQuery(DefaultJdbcDatabase.java:56)
    at io.airbyte.integrations.source.jdbc.AbstractJdbcSource.discoverInternal(AbstractJdbcSource.java:191)
    at io.airbyte.integrations.source.redshift.RedshiftSource.discoverInternal(RedshiftSource.java:90)
    at io.airbyte.integrations.source.redshift.RedshiftSource.discoverInternal(RedshiftSource.java:30)
    at io.airbyte.integrations.source.relationaldb.AbstractDbSource.discoverWithoutSystemTables(AbstractDbSource.java:255)
    at io.airbyte.integrations.source.relationaldb.AbstractDbSource.getTables(AbstractDbSource.java:509)
    at io.airbyte.integrations.source.relationaldb.AbstractDbSource.discover(AbstractDbSource.java:110)
    at io.airbyte.integrations.base.IntegrationRunner.runInternal(IntegrationRunner.java:129)
    at io.airbyte.integrations.base.IntegrationRunner.run(IntegrationRunner.java:98)
    at io.airbyte.integrations.source.redshift.RedshiftSource.main(RedshiftSource.java:136)
    2023-07-11 14:56:36 WARN i.a.w.g.DefaultDiscoverCatalogWorker(run):105 - Discover job subprocess finished with exit code 1
    2023-07-11 14:56:36 INFO i.a.w.t.TemporalAttemptExecution(get):163 - Stopping cancellation check scheduling...
    ```
  • Stan Gorch (07/11/2023, 3:31 PM)
    Hello again! I'm having trouble understanding how the first incremental sync works. A bit of context: I've launched Airbyte locally in Docker and set up a MySQL source and a BigQuery destination. I've also created a connection between them with incremental sync. My source's tables are fairly huge (about 40 million records). In the logs I see a lot of committed rows, but in BigQuery there is nothing. Does this mean the data is stored somewhere locally, or what is happening? Could you please explain in detail how an incremental sync happens? P.S. With small tables it works OK.
  • Joey Hernandez (07/11/2023, 3:34 PM)
    Hello, if we upgrade a source connector, does it upgrade all existing source connectors as well?
  • Slackbot (07/11/2023, 5:31 PM)
    This message was deleted.
  • Brian Mullins (07/11/2023, 7:38 PM)
    👋 Hello, team! Can anyone help me set up a source with NetSuite? I keep getting the following error: "HTTPError('400 Client Error: Bad Request for url"
  • Ekansh Verma (07/11/2023, 8:27 PM)
    Hello! Regarding deployment of Airbyte on EKS with Helm: if the credentials of an AWS user that did not create the EKS cluster are being used to push the logs to the S3 bucket, what permissions does this user need? Does the user require any permissions on the cluster as well, or will permissions on the S3 bucket suffice?
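    For the S3 side of the question above, a minimal sketch of the operations the log-writing credentials are exercised for (write, list, read, delete); the bucket name is a placeholder, and this deliberately says nothing about EKS-side permissions:
    ```python
    # Minimal sketch: verify the log-writing credentials can perform the S3
    # operations log storage needs. Bucket name is a placeholder.
    import boto3

    s3 = boto3.client("s3", region_name="us-east-1")
    bucket = "my-airbyte-logs-bucket"  # placeholder

    s3.put_object(Bucket=bucket, Key="airbyte-permission-check.txt", Body=b"ok")          # s3:PutObject
    print(s3.list_objects_v2(Bucket=bucket, MaxKeys=5).get("KeyCount"))                   # s3:ListBucket
    print(s3.get_object(Bucket=bucket, Key="airbyte-permission-check.txt")["Body"].read())  # s3:GetObject
    s3.delete_object(Bucket=bucket, Key="airbyte-permission-check.txt")                   # s3:DeleteObject
    ```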
  • Tony Cookey (07/11/2023, 9:02 PM)
    Hi Everyone, is it possible to tag rows (raw JSON data) with an `id`, either a `connectionId` or a `randomId`, when streaming data from multiple sources to a single destination (GitHub <<>> Postgres), so as to identify which data came from which connection? I want to stream multiple GitHub sources to one Postgres and tag each row so I know which connection (stream) owns that row of data. What are the possible ways of doing this? Open to solutions. Thanks.
  • mangole (07/11/2023, 9:20 PM)
    Hey team, we're looking to implement a data preview for our connections. This can come in handy, as the user can see a portion of the data before syncing it. I'm aware that it's not achievable out of the box, and I'm looking for a way to implement it on top of Airbyte OSS. I would appreciate any guidance on where to start and which components I should be aware of. I'm looking to extend the capability rather than apply changes to the core. Any help will be much appreciated 🙏
  • Rhys Davies (07/11/2023, 9:40 PM)
    Hey all, running into an annoying bug that I hope someone can shed some light on. I have an MS SQL Server database as a source that I sync 30 tables from to a Postgres database. I need to enable CDC because `Full Refresh | Overwrite` mode `DROP`s the tables and writes them fresh, which also drops any views I have that depend on those tables, and I am at the stage of this project where I need to start transforming the warehoused data. I can't currently do this, though, because in testing the connection fails every time with CDC enabled, with this error: `Caused by: java.time.format.DateTimeParseException: Text '-2208988800000' could not be parsed at index 11`. To be clear, these are the same tables and same data as the other source I have set up for this same database. It seemingly fails when the date is set to `1900-01-01 00:00:00.00`, but I also note that Airbyte sees all date/time fields in SQL Server as `String`. Is there any way I can fix this? It's a real blocker. I am happy to roll up my sleeves, just write some Python and deploy my own syncing service, but it seems like such a huge fall at the last hurdle for Airbyte and this project I'm working on... Thanks in advance!
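    For what it's worth, the value in the parse error lines up exactly with the 1900-01-01 rows mentioned above; a quick check:
    ```python
    # Quick check: the value in the DateTimeParseException is a millisecond epoch
    # timestamp that corresponds exactly to 1900-01-01 00:00:00.
    from datetime import datetime, timedelta

    epoch = datetime(1970, 1, 1)
    print(epoch + timedelta(milliseconds=-2208988800000))  # -> 1900-01-01 00:00:00
    ```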
  • Sai Charan (07/12/2023, 6:00 AM)
    Hi everyone :) I have connected my Shopify store to BigQuery using Airbyte. The connection was successful and the tables are getting created, but the data is not getting populated in the tables; it just says "There is no data to display". Can anyone please help?? 🙏🙏🙏🤌🤌
  • Slackbot (07/12/2023, 6:24 AM)
    This message was deleted.
  • Ekansh Verma (07/12/2023, 6:46 AM)
    Hi Team! While deploying Airbyte using Helm on EKS, I am trying to push the logs to an S3 bucket with the following configuration:
    ```yaml
    logs:
        accessKey:
          password: "ACCESS_KEY"
          existingSecret: ""
          existingSecretKey: ""
        secretKey:
          password: "SECRET_KEY"
          existingSecret: ""
          existingSecretKey: ""
        storage:
          type: "S3"
        minio:
          enabled: false
        externalMinio:
          enabled: false
          host: localhost
          port: 9000
        s3:
          enabled: true
          bucket: "bucket_name"
          bucketRegion: "us-east-1"
    ```
    The bucket exists, the credentials are valid, and they have the correct access to upload files to the bucket as well. After the installation, the worker and server fail on this validation check, which essentially means my bucket name is blank:
    ```java
    static void validateBase(final S3ApiWorkerStorageConfig s3BaseConfig) {
      Preconditions.checkArgument(!s3BaseConfig.getBucketName().isBlank());
    }
    ```
    Any reason why I am encountering this when the bucket name is clearly specified?
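    One way to narrow this down is to see what the worker actually receives; a minimal sketch of the same blank-name check, run wherever the worker's environment can be inspected. The variable names (S3_LOG_BUCKET, S3_LOG_BUCKET_REGION) are assumptions about what the chart should be setting, not confirmed chart internals.
    ```python
    # Hypothetical sketch mirroring the failing validation above: is the bucket
    # name actually present in the worker's environment? Env var names are assumed.
    import os

    bucket = os.environ.get("S3_LOG_BUCKET", "")
    region = os.environ.get("S3_LOG_BUCKET_REGION", "")

    if not bucket.strip():  # same condition as the Preconditions check quoted above
        raise ValueError("S3 log bucket name is blank - Helm values are not reaching the pod")
    print(f"bucket={bucket!r} region={region!r}")
    ```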
  • Caio César P. Ricciuti (07/12/2023, 7:09 AM)
    Hey all! Hope all is good... I have a MySQL => BigQuery connection where, after an update, the final table is not created anymore. Before, I got the `airbyte_raw_mydata` table and a table named `mydata`; now I just get the `airbyte_raw_` tables... Has anyone had this issue? MySQL source v2.1.0, BigQuery destination v1.5.1.