# ask-community-for-troubleshooting
  • viritha vanama (04/07/2023, 3:53 PM)
    Is anyone else getting this error when connecting to Google Ads from Airbyte Cloud?
  • Chen Huang (04/07/2023, 4:52 PM)
    Is there a way for Airbyte connectors to read a list of ids from storage or a database before making REST API calls? My question is about the Jira connector: we want to pre-process a list of issue ids so the connector makes one API call per id. Is this a commonly faced problem, and what is the standard way to solve it? Thanks
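    In the Python CDK this kind of "one API call per pre-processed id" pattern is commonly handled with stream slices: the stream yields one slice per id, and the framework makes one request per slice. Below is a minimal stdlib-only sketch of the pattern, not a working connector; the `load_issue_ids` helper and the `ids.txt` file are hypothetical, and a real connector would implement these as the `stream_slices` and `path` overrides on an `airbyte_cdk` `HttpStream` subclass.

    ```python
    from typing import Iterable, Mapping


    def load_issue_ids(path_to_file: str) -> Iterable[str]:
        # Hypothetical pre-processing step: one issue id per line,
        # blank lines ignored. Could equally read from a database.
        with open(path_to_file) as f:
            for line in f:
                line = line.strip()
                if line:
                    yield line


    def stream_slices(ids: Iterable[str]) -> Iterable[Mapping[str, str]]:
        # Mirrors HttpStream.stream_slices: one slice per pre-processed id.
        for issue_id in ids:
            yield {"issue_id": issue_id}


    def path(stream_slice: Mapping[str, str]) -> str:
        # Mirrors HttpStream.path: the framework calls this once per slice,
        # producing one REST call per issue id.
        return f"rest/api/3/issue/{stream_slice['issue_id']}"
    ```

    With the real CDK, `stream_slices` is overridden on the stream class and `path` receives each slice automatically, so the id list only needs to be loaded once per sync.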
  • alex (04/07/2023, 6:40 PM)
    Hi - I'm having trouble validating a local setup of an Airbyte/MongoDB Atlas connection. Has anyone seen a "Failed to run connection tests" error in the UI?
  • Remi Nonnon (04/07/2023, 6:44 PM)
    Hi there, I have a question about the MySQL source connector; I can't find a clear answer regarding consistency. I have tens of tables to sync, some of which have foreign keys, and I'm worried about the coherence of what I'm going to sync. Example: table A has a column that is a foreign key to table B's id. While the Airbyte sync runs, table A could be fully synced with references to some rows of table B. But in the meantime an update could occur and delete or change rows of table B, so when the table B sync runs I end up with an inconsistent state between my synced table A and the new contents of table B. When starting a sync, Airbyte doesn't acquire a lock on all tables, right? So I could have an inconsistent state between tables. I wonder if I need to create views so that a sync offers me better consistency, or maybe I'm completely off topic 🤔 Thank you for your help 🥰
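    The race described above can be illustrated with a toy in-memory simulation (plain Python, not Airbyte code; the table contents are made up): table A is snapshotted first, a concurrent transaction then deletes a row of B, and B is snapshotted afterwards, leaving an orphaned foreign key in the synced copies.

    ```python
    from copy import deepcopy

    # Toy in-memory "database": A.b_id is a foreign key into B.id.
    table_a = [{"id": 1, "b_id": 10}, {"id": 2, "b_id": 20}]
    table_b = [{"id": 10}, {"id": 20}]

    # 1. The sync snapshots table A first.
    synced_a = deepcopy(table_a)

    # 2. A concurrent transaction deletes a row of B mid-sync.
    table_b = [row for row in table_b if row["id"] != 20]

    # 3. The sync then snapshots table B.
    synced_b = deepcopy(table_b)

    # The two synced copies are now mutually inconsistent: row 2 of A
    # references a B id that no longer exists in the B snapshot.
    b_ids = {row["id"] for row in synced_b}
    orphans = [row for row in synced_a if row["b_id"] not in b_ids]
    print(orphans)
    ```

    Since each stream is read independently, a point-in-time-consistent source view (for example database-side views over a consistent snapshot, or CDC) is the usual way to mitigate this, but that is a general database observation rather than documented Airbyte behavior.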
  • DJ Miracle (04/07/2023, 6:59 PM)
    Hi there, I'm using a local deployment of Airbyte with a connection from MySQL (MariaDB 10.5.19-MariaDB-1:10.5.19+maria~deb10) to Snowflake. I have tested some smaller tables and our largest table; for the smaller tables all records were moved successfully, but for the larger test there was a discrepancy in records. The source table contains about 100M records and the loaded Snowflake table was short by just over 15k records. I used a full refresh overwrite sync; the logs show no errors and the sync completed successfully. The connector versions are both up to date as far as I can tell: 2.0.13 for MySQL and 0.1.34 for Snowflake. Any ideas on where I can start looking? Thanks, DJ
  • Gabriel Levine (04/07/2023, 8:01 PM)
    Anyone have tips on how to access the logs for a job when they're too large to be displayed in the UI? My jobs log to a GCS bucket, but I can't see the path in the UI, so I can't find the logs. Version is 0.42.1.
  • Sam Kmetz (04/08/2023, 12:13 AM)
    Hi all! I've been working with Airbyte for a bit now to move data from a few Postgres dbs to Snowflake and it's really great! However, one concern I have is that Airbyte clusters all of the tables in Snowflake really inefficiently (by unique key), leading to drastically increased, and unnecessary, Snowflake spend on large tables that require reclustering. As an example, we move about 200MB an hour and this has cost us ~$1,000 a month in unnecessary Snowflake credit spend (on enterprise Snowflake).

    Clustering is super powerful when it's done right, but it's not an index, which seems to be how it was treated here. Is there a way to turn this off? I haven't tried swapping the table out with a manually created one, because I don't want to break anything. I am aware that I can just import the data as a JSON column, but that takes away some of the elegance of the product (tables showing up just as they are in the source). There have been some questions posted here before, but no really good explanations of why this is an issue or whether a fix is on the roadmap. I would expect that as larger clients adopt Airbyte this will become more and more of an issue; Snowflake is presumably one of the more frequently used destinations and an area of great growth.

    Clustering could actually be part of the solution, but it would need to be done on the cursor timestamp field, aggregated to, say, a daily frequency. That way Snowflake only has to open the most recent few files when running the Airbyte insert or update queries. For the time being, though, it would be better to turn it off entirely. More than happy to discuss this further and would love to contribute from the Snowflake side; I just don't know Java.
  • Akash (04/08/2023, 4:18 AM)
    I've got a couple of questions about building an HTTP API connector using the Airbyte CDK for Python. Is it possible to make POST requests using the CDK?
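    As far as I know, the CDK's `HttpStream` defaults to GET via its `http_method` property, and overriding it to return "POST" (together with supplying a request body) is the usual approach. Below is a stdlib-only sketch of the request shape being described, not actual CDK code; the endpoint URL and payload are hypothetical, and a real connector would instead override `http_method` and the request-body hook on its stream class.

    ```python
    import json
    import urllib.request


    def build_post_request(url: str, payload: dict) -> urllib.request.Request:
        # Build (but do not send) a JSON POST request, mirroring what the CDK
        # assembles when a stream declares a POST method and a JSON body.
        return urllib.request.Request(
            url,
            data=json.dumps(payload).encode("utf-8"),
            headers={"Content-Type": "application/json"},
            method="POST",
        )


    req = build_post_request(
        "https://api.example.com/v1/search",  # hypothetical endpoint
        {"query": "orders", "page_size": 100},
    )
    print(req.get_method())  # POST
    ```

    Sending the request (e.g. with `urllib.request.urlopen(req)`) and parsing the JSON response would then correspond to the stream's `parse_response` step.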
  • Akash (04/08/2023, 6:14 AM)
    How can I create a class that makes a POST request to my API within the HTTP API Python CDK structure?
  • Qamarudeen Muhammad (04/08/2023, 1:21 PM)
    I have run into various issues with Airbyte version 0.43.0, the latest being an inability to set TEMPORAL_HISTORY_RETENTION_IN_DAYS=7 in the env file. After adding that tag to the ".env" file, all previously working connections stop working, and I get the following error even though the data sources are different: "Sync Failed. Failure Origin: source, Message: State code: 08001; Message: The connection attempt failed."
  • Akash (04/08/2023, 7:00 PM)
    Hello team, I've got a question that has been bugging me for a while. Let's consider the example of the Klarna source connector. Is it impossible to make a POST request using an HTTP source connector (in my case I'd like to POST to an API)? Everywhere I look I only see GET requests being made, and I was wondering how I can make POST requests instead and get data back.
  • Jeffrey Zhang (04/09/2023, 9:42 AM)
    Hello all. Can I use Airbyte to sync blob data from S3 to S3?
  • Jin Gong (04/09/2023, 3:24 PM)
    Does Airbyte support a sync frequency of less than 1 hour for the CDC Postgres -> Snowflake integration? I was assuming CDC could be a near real-time sync.
  • Akash (04/09/2023, 6:02 PM)
    Okay, so I have a custom data warehouse and a JDBC driver for it. I want users of my source connector to be able to run SQL operations through the connector. How exactly can I build this? I'd appreciate any help I can get; please let me know if you need more details.
  • xi-chen.qi (04/10/2023, 3:43 AM)
    Hi team, if the source is a Google Sheet, how should Airbyte correctly recognize the sheet format?
  • Tomas Draksas (04/10/2023, 5:55 AM)
    Hey guys, does anyone know how to connect an Apify run's success status to an Airbyte transfer?
  • Abhishek Kale (04/10/2023, 9:46 AM)
    Hi team, I wanted to know which connectors are available on a free account, and also how many records I can download with a free account.
  • Shreepad Khandve (04/10/2023, 9:52 AM)
    Hi team, I am getting the same error as below:
  • Shreepad Khandve (04/10/2023, 9:52 AM)
    I have deployed Airbyte on AWS EC2 instances, one for development and another for production. I created and tested one custom connector on the dev instance; I uploaded that image in the dev interface and it worked. I then pushed the same image to an AWS ECR repository and pulled it on the production instance, but when I try to create a new connector there I get the error shown in the image above. Let me know if I'm missing something. Dev version: 0.43.0, prod version: 0.40.32.
  • Sushant (04/10/2023, 1:21 PM)
    Hi, with the introduction of the new Airbyte API, what is the future of the Configuration API that open-source accounts can currently use? Will support for the Configuration API still be available in the future? Also, if we want to switch to the Airbyte API, can we do it with an open-source account, or is Airbyte Cloud mandatory?
  • nagarjuna (04/10/2023, 1:31 PM)
    👋 Hello, team! I am using the existing Amazon Ads connector, but I don't see all the fields in the output JSON file. I tried adding the required fields in source-amazon-ads --> source_amazon_ads --> schemas --> sponsored_brands.py, but I still don't see these fields in the final output file. Please suggest or share if anyone has info on this.
  • Chaochao Zhang (04/10/2023, 1:35 PM)
    Hi there, I'm following the official doc to deploy Airbyte on AWS EKS (v1.25.6); could someone help check this failure? Thank you! It seems airbyte-webapp and airbyte-server fail because of airbyte-temporal. Here are some logs from the pod airbyte-temporal-56db585db5-cg8ps:
    PostgreSQL started.
    + echo 'PostgreSQL started.'
    + setup_schema
    + '[' postgresql == mysql ']'
    + '[' postgresql == postgresql ']'
    + echo 'Setup PostgreSQL schema.'
    + setup_postgres_schema
    Setup PostgreSQL schema.
    + SCHEMA_DIR=/etc/temporal/schema/postgresql/v96/temporal/versioned
    + '[' temporal '!=' airbyte ']'
    + temporal-sql-tool --plugin postgres --ep airbyte-db-svc -u airbyte -p 5432 create --db temporal
    2023-04-10T13:25:12.095Z        ERROR   Unable to create SQL database.  {"error": "pq: database \"temporal\" already exists", "logging-call-at": "handler.go:97"}
    2023/04/10 13:25:12 Loading config; env=docker,zone=,configDir=config
    2023/04/10 13:25:12 Loading config files=[config/docker.yaml]
    {"level":"info","ts":"2023-04-10T13:25:12.134Z","msg":"Updated dynamic config","logging-call-at":"file_based_client.go:143"}
    {"level":"info","ts":"2023-04-10T13:25:12.134Z","msg":"Starting server for services","value":["history","matching","frontend","worker"],"logging-call-at":"server.go:123"}
    {"level":"info","ts":"2023-04-10T13:25:12.160Z","msg":"PProf not started due to port not set","logging-call-at":"pprof.go:67"}
    [Fx] SUPPLY     *resource.BootstrapParams
    [Fx] SUPPLY     chan struct {}
    ...
    {"level":"info","ts":"2023-04-10T13:25:12.661Z","msg":"Received a ring changed event","service":"matching","component":"service-resolver","service":"frontend","logging-call-at":"rpServiceResolver.go:219"}
    {"level":"info","ts":"2023-04-10T13:25:12.661Z","msg":"Current reachable members","service":"matching","component":"service-resolver","service":"frontend","addresses":["169.254.172.30:7233"],"logging-call-at":"rpServiceResolver.go:266"}
    {"level":"info","ts":"2023-04-10T13:25:12.661Z","msg":"Received a ring changed event","service":"matching","component":"service-resolver","service":"matching","logging-call-at":"rpServiceResolver.go:219"}
    {"level":"fatal","ts":"2023-04-10T13:25:22.603Z","msg":"error starting scanner","service":"worker","error":"context deadline exceeded","logging-call-at":"service.go:233","stacktrace":"go.temporal.io/server/common/log.(*zapLogger).Fatal\n\t/temporal/common/log/zap_logger.go:150\ngo.temporal.io/server/service/worker.(*Service).startScanner\n\t/temporal/service/worker/service.go:233\ngo.temporal.io/server/service/worker.(*Service).Start\n\t/temporal/service/worker/service.go:153\ngo.temporal.io/server/service/worker.ServiceLifetimeHooks.func1.1\n\t/temporal/service/worker/fx.go:80"}
  • DJ Miracle (04/10/2023, 3:41 PM)
    I just wanted to follow up on this. Please let me know if there is another channel or escalation path I should use.
  • Trung Luong (04/10/2023, 4:36 PM)
    Hi all, we are trying to make a connection to a DB2 server but Airbyte is throwing an error. We made sure the credentials are right, but Airbyte is still unable to connect. Has anyone been able to connect to DB2 and could provide some help? Thanks.
  • Jamil B (04/10/2023, 5:23 PM)
    Hello, my Airbyte instance on AWS just ate all 100GB dedicated to it, and I can't figure out where the space went. Running a bunch of du commands, I traced the big consumption down to /var/lib/docker/overlay2. I'm not sure how to clean up the space at this point; any recommendations?
  • Konstantin Lackner (04/10/2023, 6:20 PM)
    I am having an issue with the Google Analytics 4 (GA4) source connector. There is a discrepancy between the data shown directly in the GA4 dashboard and the data loaded into Looker Studio from the website_overview stream via BigQuery. Can someone explain this discrepancy and ideally suggest a solution? Airbyte version: 0.42.0, GA4 source connector version: 0.1.3, BigQuery destination version: 1.2.18.
  • Adem Zeqiri (04/10/2023, 6:23 PM)
    Dear community, I am considering Airbyte for a PoC I am doing for a client. The issue is that I have to test CDC too, and the source is Oracle. Based on the Airbyte documentation, CDC support for Oracle is coming soon but is not there yet. Do we know approximately when the version that supports Oracle CDC will come out? Thank you in advance.