# ask-community-for-troubleshooting

    Alexis Blandin

    12/27/2022, 2:48 PM
    Hi, I've found time to work on this again, and I'm still struggling with running Airbyte via docker-compose with Postgres on RDS. To be more precise, I get this error in the worker container:
    Micronaut (v3.7.4)
    
    2022-12-27 13:31:29 INFO i.m.c.e.DefaultEnvironment(<init>):159 - Established active environments: [ec2, cloud, control-plane]
    2022-12-27 13:31:29 INFO c.z.h.HikariDataSource(<init>):71 - HikariPool-1 - Starting...
    2022-12-27 13:31:30 INFO c.z.h.HikariDataSource(<init>):73 - HikariPool-1 - Start completed.
    2022-12-27 13:31:30 INFO c.z.h.HikariDataSource(<init>):71 - HikariPool-2 - Starting...
    2022-12-27 13:31:30 INFO c.z.h.HikariDataSource(<init>):73 - HikariPool-2 - Start completed.
    2022-12-27 13:31:30 INFO i.m.l.PropertiesLoggingLevelsConfigurer(configureLogLevelForPrefix):107 - Setting log level 'DEBUG' for logger: 'io.airbyte.bootloader'
    2022-12-27 13:31:31 INFO i.a.w.c.DatabaseBeanFactory(configsDatabaseMigrationCheck):119 - Configs database configuration: 0.35.15.001 60000
    2022-12-27 13:31:31 ERROR i.m.r.Micronaut(handleStartupException):338 - Error starting Micronaut server: Error instantiating bean of type [io.airbyte.persistence.job.factory.OAuthConfigSupplier]
    
    Message: SQL [select * from airbyte_metadata where key = ?]; ERROR: relation "airbyte_metadata" does not exist
      Position: 15
    Before starting the container the database was empty; afterwards the relation/table was created and filled with these values (the user has all privileges on the database):
    airbyte_config=> select * from airbyte_metadata ;
                 key              | value
    ------------------------------+-------
     airbyte_protocol_version_min | 0.0.0
     airbyte_protocol_version_max | 0.3.0
    (2 rows)
    In docker-compose.yml I commented out the db service as well as the other services' dependencies and volumes related to it. If somebody can help me with this, thanks!
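    For reference, a minimal diagnostic sketch along these lines can help rule out a credentials/database mismatch: it runs the same query the worker fails on, against the external Postgres. The host, user, password, and database name below are placeholders; substitute the DATABASE_* values from your .env.
    # Hypothetical check: can the worker's credentials actually see airbyte_metadata on RDS?
    import psycopg2

    conn = psycopg2.connect(
        host="your-rds-endpoint.rds.amazonaws.com",  # placeholder
        port=5432,
        user="airbyte",           # placeholder, should match DATABASE_USER
        password="***",           # placeholder, should match DATABASE_PASSWORD
        dbname="airbyte_config",  # placeholder, should match DATABASE_DB
    )
    with conn, conn.cursor() as cur:
        # Print the database/schema actually reached, to catch "connected to the wrong db" mistakes.
        cur.execute("select current_database(), current_schema()")
        print(cur.fetchone())
        # Same query the bootloader/worker runs.
        cur.execute("select * from airbyte_metadata where key = %s", ("airbyte_protocol_version_min",))
        print(cur.fetchall())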

    laila ribke

    12/27/2022, 4:49 PM
    Hi all, for the Nordigen API source connector I just realized that I receive key pair credentials for getting the token. Therefore I can't use it as-is with the OAuthAuthenticator. I first need to get the token and afterwards put the token in the header of the endpoint API call. Has someone done this in another source and can send me the YAML file, or any ideas/docs to follow? I'm truly lost.
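    For what it's worth, the two-step flow can be prototyped outside the connector first. A rough sketch follows; the endpoint paths and response field names are assumptions based on Nordigen's public API docs, so verify them before relying on this.
    # Sketch of the Nordigen-style flow: exchange the key pair for an access token,
    # then send the token as a Bearer header on the data endpoint.
    # URLs and field names below are assumptions; check Nordigen's API reference.
    import requests

    SECRET_ID = "..."   # your key pair credentials
    SECRET_KEY = "..."

    # Step 1: obtain an access token from the token endpoint.
    token_resp = requests.post(
        "https://ob.nordigen.com/api/v2/token/new/",   # assumed endpoint
        json={"secret_id": SECRET_ID, "secret_key": SECRET_KEY},
    )
    token_resp.raise_for_status()
    access_token = token_resp.json()["access"]          # assumed response field

    # Step 2: call the actual API with the token in the request header.
    resp = requests.get(
        "https://ob.nordigen.com/api/v2/institutions/?country=gb",  # assumed endpoint
        headers={"Authorization": f"Bearer {access_token}"},
    )
    print(resp.status_code, resp.json())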

    Balasubramanian T K

    12/27/2022, 5:05 PM
    Hey, could someone share the link to the most requested connectors?
    ✅ 1

    Temidayo Azeez

    12/26/2022, 4:16 PM
    I am trying to follow this tutorial to set up a pipeline between MySQL and Postgres: https://airbyte.com/tutorials/migrate-from-mysql-to-postgresql. For reasons I don't understand, the source (MySQL) just keeps failing and I couldn't set it up. From what I can see in the UI, I am using the MySQL beta source connector, latest version 1.0.18. I also added my server logs.
    ✅ 1

    Leonardo de Almeida

    12/27/2022, 6:14 PM
    I've found a lot of errors in Discuss like:
    HikariPool-2 - Connection org.postgresql.jdbc.PgConnection@6fe46b62 marked as broken because of SQLSTATE(08006), ErrorCode(0)
    I've been facing this issue since August but cannot find any solution. Is anyone else going through this who can help me out?

    Joviano Cicero Costa Junior

    12/27/2022, 7:39 PM
    Hey guys. I am exploring CDC loads, and after deleting records from the source, the deletes were not replicated to the final table on the target. Using the INCREMENTAL APPEND option, all records are persisted on the target but I can't tell whether a record is still alive; I need to use the raw tables to identify that. Using the INCREMENTAL DEDUP option, the records persist in the target table and the _SCD table as if they were still valid.

    Adham Suliman

    12/27/2022, 7:47 PM
    I'm running a local instance of Airbyte, and I'm attempting to use a timestamp filter within my request parameters for an incremental build. My Airbyte syncs return no records when using the latest start time produced by the most recent record, but records are returned when I make the request with the same timestamp in Postman. The requests for the source aren't shown in the logs. Which container should I explore to find the API request made during a sync?

    Thao Ton

    12/27/2022, 10:19 PM
    Can anyone help explain the difference between sync mode on the source and on the destination? I want to use the incremental sync feature, but it's not clear whether it's supported. For example, the Airbyte documentation says the MSSQL destination connector does NOT support 'Incremental', while the MSSQL source connector supports 'Incremental Sync - Append'.

    Hansal Shah

    12/28/2022, 4:35 AM
    Is there a way to extract data from the Google Cloud Platform using Airbyte?

    Özgür Sallancı

    12/23/2022, 6:18 AM
    This is the YAML file I created. I'm just trying to call this Alchemy endpoint with a wallet address (https://docs.alchemy.com/reference/getnfts).

    Ilkka Peltola

    12/28/2022, 7:50 AM
    I installed Airbyte on EC2 with 100 GB of storage, set up staging to S3, and overnight it ran out of storage. I suspect the main cause was the Recurly connector, which is in Alpha, but is there any generic advice on how to keep Airbyte from running out of disk space?

    Till Blesik

    12/28/2022, 9:46 AM
    Hi everyone, I am developing a connector using the low-code / configuration-based approach. I am wondering if there is a way to support multiple authorization methods (e.g. API token and OAuth2)?

    Giorgos Tzanakis

    12/28/2022, 10:39 AM
    Hi all. I'm a new user of Airbyte Cloud; I will be syncing AWS Aurora MySQL tables to BigQuery using the CDC replication method. I have a question about schema changes. In the docs, I read:
    We do not support schema changes automatically for CDC sources. We recommend resetting and resyncing data if you make a schema change
    I have 2 questions:
    • If a field is added in one of the source tables, is this going to break the sync? Or is it merely going to be ignored?
    • Is there a plan to make Airbyte somehow handle MySQL schema changes? (I found this GitHub issue, but there are no updates.)
    Thank you!
    🙏 2

    Zaza Javakhishvili

    12/28/2022, 10:55 AM
    Hi 🙂 Guys, please review and approve a small fix to the Amazon Seller Partner connector. We are using this connector and need to get the fix ASAP.

    Felipe Bonzanini

    12/28/2022, 3:02 PM
    Hey everyone, I hope you are all doing great. I am trying Airbyte in a POC, and there are 2 things I could use some help with:
    1. When a sync fails, the TMP tables are not deleted, and I get two types of tmp tables: airbyte_tmp and _dbt_tmp. I learned I could delete the first, but what about the latter?
    2. I am hosting Airbyte on a GCP virtual machine and the root filesystem keeps filling up. When I recycle Docker or the entire VM, usage goes back down; what maintenance can I do to avoid problems?
    I have been researching these two topics, but if anybody could point me in the right direction, I would be very thankful! Happy Holidays!

    Manish Tomar

    12/28/2022, 4:56 PM
    Does Airbyte need to resync the full bulk load again if we rename even one column?

    Sunny Rekhi

    12/28/2022, 5:33 PM
    Hey folks, I'm trying to contribute a connector to Airbyte. I added a connector locally and it isn't showing up in the UI when I run docker-compose up. I even modified other connector names in source_definitions.yaml to see if the changes show up in the UI, but I'm not seeing the UI source list change at all. I thought running VERSION=dev docker-compose up would solve the problem, but then I get this error:
    terminal$ VERSION=dev docker-compose up
    WARNING: The RUN_DATABASE_MIGRATION_ON_STARTUP variable is not set. Defaulting to a blank string.
    WARNING: The DEPLOYMENT_MODE variable is not set. Defaulting to a blank string.
    WARNING: The LOG_CONNECTOR_MESSAGES variable is not set. Defaulting to a blank string.
    WARNING: The SHOULD_RUN_NOTIFY_WORKFLOW variable is not set. Defaulting to a blank string.
    WARNING: The SECRET_PERSISTENCE variable is not set. Defaulting to a blank string.
    WARNING: The JOB_ERROR_REPORTING_SENTRY_DSN variable is not set. Defaulting to a blank string.
    WARNING: The NEW_SCHEDULER variable is not set. Defaulting to a blank string.
    WARNING: The WORKER_ENVIRONMENT variable is not set. Defaulting to a blank string.
    WARNING: The GITHUB_STORE_BRANCH variable is not set. Defaulting to a blank string.
    WARNING: The REMOTE_CONNECTOR_CATALOG_URL variable is not set. Defaulting to a blank string.
    WARNING: The TEMPORAL_HISTORY_RETENTION_IN_DAYS variable is not set. Defaulting to a blank string.
    WARNING: The UPDATE_DEFINITIONS_CRON_ENABLED variable is not set. Defaulting to a blank string.
    Pulling bootloader (airbyte/bootloader:dev)...
    ERROR: manifest for airbyte/bootloader:dev not found: manifest unknown: manifest unknown
    Any idea how to fix this? I'm running on a Mac M1. Regular docker-compose up works fine. The docs say:
    Note that modifications to source_definitions.yaml will only be picked up the first time you start Airbyte, when you upgrade Airbyte, or if you entirely wipe out your instance of Airbyte and start from scratch.
    I've shut down the Docker containers and git pulled the latest code to force an update. That didn't work.

    Ketan Mangukiya

    12/28/2022, 8:06 PM
    Hello Team, hope you are doing well. I have one question about Airbyte: which API do I have to use to test a connection using a Google auth code, workspace id, sourceDefinitionId, etc.? Thanks in advance.
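    If it's the self-hosted config API, a sketch like the following is one way to check a source configuration before creating it. The endpoint path and payload shape here are assumptions; verify them against the API docs of your Airbyte version.
    # Hedged sketch: check a source configuration via the OSS config API.
    # Endpoint and field names are assumptions, not confirmed for every Airbyte version.
    import requests

    API = "http://localhost:8000/api/v1"   # placeholder base URL

    payload = {
        "sourceDefinitionId": "...",        # the connector's definition id
        "connectionConfiguration": {
            # placeholder: the connector's spec fields, including whatever
            # credentials were produced from the Google auth code exchange
        },
    }
    r = requests.post(f"{API}/scheduler/sources/check_connection", json=payload)
    r.raise_for_status()
    print(r.json().get("status"), r.json().get("message"))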

    Rohith Reddy

    12/29/2022, 3:21 AM
    Hi Team, is there a way to deploy a fork of a connector onto an existing Airbyte open-source deployment?

    Rohith Reddy

    12/29/2022, 9:10 AM
    I am facing an issue syncing from a BigQuery source to Redshift, with the error "the response is too large to return".

    Rohith Reddy

    12/29/2022, 9:19 AM
    Can anyone help with a guide for building and deploying forks of https://github.com/airbytehq/airbyte?

    Dany Chepenko

    12/29/2022, 9:27 AM
    Hey, has anyone succeeded in connecting to the LinkedIn API here? I followed the instructions in the docs but failed to retrieve the access code.
    {
      "error": "invalid_redirect_uri",
      "error_description": "Unable to retrieve access token: appid/redirect uri/code verifier does not match authorization code. Or authorization code expired. Or external member binding exists"
    }
    I'm using https://airbyte.io as the redirect_uri. My GET request is the following:
    https://www.linkedin.com/oauth/v2/authorization?response_type=code&client_id=78oy2gu644mxz2&redirect_uri=https%3A%2F%2Fairbyte.io&scope=r_ads,r_ads_reporting,r_organization_social
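    If it helps, the redirect_uri used when exchanging the code for a token must match the one in the authorization URL exactly. A rough sketch of that exchange is below; the token endpoint and parameter names follow LinkedIn's OAuth 2.0 docs as far as I recall, and the client secret and code are placeholders.
    # Sketch of the code-for-token exchange. redirect_uri must be byte-for-byte
    # the same value used in the authorization URL (here https://airbyte.io).
    import requests

    resp = requests.post(
        "https://www.linkedin.com/oauth/v2/accessToken",
        data={
            "grant_type": "authorization_code",
            "code": "AUTH_CODE_FROM_REDIRECT",   # placeholder
            "redirect_uri": "https://airbyte.io",
            "client_id": "78oy2gu644mxz2",
            "client_secret": "***",              # placeholder
        },
    )
    print(resp.status_code, resp.json())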

    Dipankar Sarkar

    12/29/2022, 10:23 AM
    Hello Team,

    Dipankar Sarkar

    12/29/2022, 10:28 AM
    Hello Team, hope you are doing well. I have one question. I'm trying to build a simple ETL process using Airbyte. My source is MSSQL Server and the destination is MSSQL as well. I want to use only some basic SQL queries to select the data, but unfortunately I can't find any way to do that. Can anyone help me with this? Thanks in advance.

    Giorgos Tzanakis

    12/29/2022, 11:12 AM
    Hi all. A question about Octavia: is this a tool that can be used with a paid Airbyte Cloud account, or is it for self-hosted deployments only? Thank you.

    Nguyễn An

    12/29/2022, 3:14 PM
    Hi team. I have a question about loading data from S3 to ClickHouse. My S3 folder is organized as myFolder/table1/*.json, myFolder/table2/*.json, myFolder/table3/*.json. I want to load these into separate tables in ClickHouse. Is it possible to create a single connection to load all of them, or do I need to create a separate connection for each? If the latter is the case, is there any way to create the connections programmatically?
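    On the programmatic part, the self-hosted API can be scripted. A rough sketch follows that creates one S3 source per folder; the endpoint path and payload shape are assumptions based on the OSS config API, and the configuration fields are placeholders, so check the API docs for your Airbyte version.
    # Rough sketch: create one S3 source per folder via the Airbyte OSS API,
    # so each folder can get its own connection. Endpoints and payload fields
    # are assumptions; verify against your Airbyte version's API documentation.
    import requests

    API = "http://localhost:8000/api/v1"     # placeholder base URL
    WORKSPACE_ID = "..."                     # placeholder
    S3_SOURCE_DEFINITION_ID = "..."          # placeholder: the S3 source definition id

    for table in ["table1", "table2", "table3"]:
        payload = {
            "workspaceId": WORKSPACE_ID,
            "sourceDefinitionId": S3_SOURCE_DEFINITION_ID,
            "name": f"s3-{table}",
            "connectionConfiguration": {
                # placeholder config: point each source at one folder
                "dataset": table,
                "path_pattern": f"myFolder/{table}/*.json",
                # bucket, credentials and format options would go here
            },
        }
        r = requests.post(f"{API}/sources/create", json=payload)
        r.raise_for_status()
        print(table, r.json().get("sourceId"))
        # A connection per source would then be created with POST {API}/connections/create.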

    Jens Teichert

    12/29/2022, 3:45 PM
    MongoDB source: ISO timestamps show up as strings, any workaround? Hi all, we're currently trying to set up a MongoDB -> BigQuery sync and are blocked because ISO date strings in MongoDB show up as strings rather than timestamps in our destination, which makes aggregations unnecessarily hard. I know the MongoDB connector is in alpha, but I was wondering if there is an easy-to-set-up/maintain workaround available? Might be related to: https://github.com/airbytehq/airbyte/issues/10031 Thanks!

    Pablo Morales

    12/29/2022, 4:45 PM
    Hi, I'm deploying Airbyte locally (OS: Windows 11 Pro) to fix a source-Shopify problem. The problem is that it doesn't recognize the changes we make to the files (using Visual Studio Code). Running docker-compose up does not detect any changes, and the problem persists. Any idea what is happening? Thank you in advance for the help. PS: To put the changes in context: when extracting orders, there is a bug in which, instead of discount_applications, it detects discount_allocations, and it always comes out empty. We need to be able to pull this data from discount_applications.

    Santiago Stachuk

    12/29/2022, 6:22 PM
    Hi dear people! I'm having a peculiar problem with the Facebook Marketing connector. I'm ingesting data from the ads_insights stream, and the first run fetches all the records; up to here it works fine. But when the next sync starts, no records are fetched, and this behavior keeps repeating on the following days. Something curious is that, for the same source and configuration, it works like a charm in my dev environment, but in stage I have this problem. A workaround for now is to reset the connection and hope for the best, but is there a known issue that could be causing this? Thank you in advance for your time.

    Robert Put

    12/29/2022, 7:28 PM
    Does anyone know how to send sync failures to PagerDuty? Would using https://developer.pagerduty.com/docs/ZG9jOjExMDI5NTgw-events-api-v2-overview work?
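    A minimal sketch of that Events API v2 call is below; the routing key is a placeholder, and wiring it to Airbyte (a webhook receiver, an orchestrator hook, or a wrapper around the sync) is left out.
    # Minimal sketch: send a PagerDuty Events API v2 alert for a failed sync.
    # The routing key is a placeholder from your PagerDuty service integration.
    import requests

    event = {
        "routing_key": "YOUR_INTEGRATION_ROUTING_KEY",  # placeholder
        "event_action": "trigger",
        "payload": {
            "summary": "Airbyte sync failed: my-connection",  # example text
            "source": "airbyte",
            "severity": "error",
        },
    }
    r = requests.post("https://events.pagerduty.com/v2/enqueue", json=event)
    print(r.status_code, r.json())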