# ask-community-for-troubleshooting
  • Arun

    07/27/2022, 10:32 PM
    Hey folks, I am new to Airbyte. Can we install Airbyte without Docker?
  • Willi

    07/28/2022, 10:07 AM
    Hello, how would it be possible to have a loading strategy that is a “full-refresh append partition”, but only creates one new partition per day? Multiple runs per day should replace the partition of the current day. It seems that Airbyte allows overwriting a table but not a partition. How could I implement that?
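    A minimal sketch of one workaround, with all table and column names hypothetical: let the Airbyte connection run as full refresh | overwrite into a staging table (which Airbyte does support), then have a small scheduled job or custom transformation replace only the current day's partition in the reporting table, so multiple runs per day keep rewriting the same daily snapshot.
    # Hedged sketch, names hypothetical: Airbyte overwrites staging.daily_table on each
    # run; this post-load step then replaces just today's "partition" downstream.
    import datetime

    import psycopg2  # assuming a Postgres-compatible warehouse

    today = datetime.date.today().isoformat()

    with psycopg2.connect("dbname=warehouse user=etl") as conn, conn.cursor() as cur:
        # Remove whatever earlier runs of the day already loaded...
        cur.execute(
            "DELETE FROM reporting.daily_snapshots WHERE partition_date = %s", (today,)
        )
        # ...and re-fill today's partition from the freshly overwritten staging table.
        cur.execute(
            """
            INSERT INTO reporting.daily_snapshots
            SELECT %s AS partition_date, s.* FROM staging.daily_table AS s
            """,
            (today,),
        )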
  • Kevin Phan

    07/28/2022, 2:19 PM
    Hello everyone, I am creating a new connector for my Select Star source. In total the API exposes 1750 tables, and the tables differ in column names and schema. I'd want to ingest all of those tables into S3 via Airbyte. I am stuck on the discover method: more specifically, how would I define the discover method and the JSON schemas for all of these tables? It seems like I could have a function that generates the JSON schema dynamically on the fly, to be used for the Airbyte catalog -> configured catalog. Maybe I am misunderstanding something? Thanks for the help. @Alex Marquardt (Airbyte) @Alexandre Airvault @ anyone else
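    A hedged sketch of that idea, generating the schemas on the fly at discover time; the airbyte_cdk model classes are real, but the shape of the Select Star table/column payload below is an assumption:
    # Hedged sketch: build one AirbyteStream per table returned by the API, with a
    # JSON schema derived from the table's columns. Payload field names are assumed.
    from typing import Any, Iterable, Mapping

    from airbyte_cdk.models import AirbyteCatalog, AirbyteStream, SyncMode


    def build_catalog(tables: Iterable[Mapping[str, Any]]) -> AirbyteCatalog:
        streams = []
        for table in tables:
            json_schema = {
                "$schema": "http://json-schema.org/draft-07/schema#",
                "type": "object",
                "properties": {
                    # Loose typing keeps discovery cheap; tighten per column if needed.
                    column["name"]: {"type": ["null", "string"]}
                    for column in table["columns"]
                },
            }
            streams.append(
                AirbyteStream(
                    name=table["name"],
                    json_schema=json_schema,
                    supported_sync_modes=[SyncMode.full_refresh],
                )
            )
        return AirbyteCatalog(streams=streams)


    # Inside the source, discover() would call the API and return the catalog, e.g.:
    #     return build_catalog(select_star_client.list_tables())  # hypothetical client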
  • Gabriel Lecointere

    07/29/2022, 3:59 AM
    Hello team, I'm constantly getting {"error":"invalid_grant","error_description":"expired access/refresh token"} when trying to set up my first Salesforce source 🤔 even though I use the refresh token I receive after completing the OAuth flow. Any ideas? My SFDC connected app is set to "Refresh token is valid until revoked" with no timeout set. I'm using Airbyte OSS. Thanks in advance. Solved: I was toggling "Sandbox" on since it was a dev account, but it turned out not to be a sandbox.
  • Jason Maddern

    07/29/2022, 5:02 AM
    I’m not sure if this is the correct place, so if not please feel free to redirect me. I’m new to Airbyte and would like to build an HTTP connector to sync data from one of our products into our Snowflake warehouse. I know the Airbyte website promises you can build one in 2 hrs (lol), but I’m not a Python dev. I’m thinking this is one of those cases where we may be best off outsourcing it to skilled people - does anyone know where, or with whom, I could engage to have a connector built (which will be used internally by us, but also later published in the catalog for our clients)?
  • Pavel Filimoshkin

    07/29/2022, 7:36 AM
    Hello team! I have a problem with the Amplitude source at the normalization stage: “Failure Origin: normalization, Message: Something went wrong during normalization 2:02PM 07/29 3 attempts 2022-07-29 071710 - Additional Failure Information: message=‘io.temporal.serviceclient.CheckedExceptionWrapper: java.util.concurrent.ExecutionException: java.lang.RuntimeException: io.airbyte.workers.exception.WorkerException: Running the launcher normalization-orchestrator failed’, type=‘java.lang.RuntimeException’, nonRetryable=false” How can I solve this, and who can help me?
  • 404 Roy

    07/29/2022, 9:28 AM
    I modified the code in the airbyte-workers module. How do I rebuild and restart so that my build overwrites the original container? How can I solve this, and who can help me?
  • Aditi Bhalawat

    07/29/2022, 11:18 AM
    Hello all, I need help creating a Python Airbyte source that performs an incremental sync. I followed the source created by Source-Faker, but that didn't work; it is still storing every record. I used this function to get the cursor:
    from typing import Any, Dict

    def get_stream_cursor(state: Dict[str, Any], stream: str) -> int:
        # Return the saved cursor for this stream, or 0 if there is no state yet.
        cursor = (state[stream]["cursor"] or 0) if stream in state else 0
        return cursor
    How can I resolve this?
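    For comparison, a hedged sketch (not the Source-Faker code) of how an incremental stream usually works with the Python CDK: the cursor is persisted via get_updated_state(), and records at or below the stored cursor are filtered out when reading. The API URL, path, and field names are assumptions.
    # Hedged sketch of an incremental HTTP stream with the Python CDK.
    from typing import Any, Iterable, Mapping, MutableMapping, Optional

    import requests
    from airbyte_cdk.sources.streams.http import HttpStream


    class Contacts(HttpStream):
        url_base = "https://api.example.com/"  # hypothetical API
        primary_key = "id"
        cursor_field = "updated_at"            # marks the stream as incremental

        def path(self, **kwargs) -> str:
            return "contacts"

        def next_page_token(self, response: requests.Response) -> Optional[Mapping[str, Any]]:
            return None  # no pagination in this sketch

        def get_updated_state(
            self,
            current_stream_state: MutableMapping[str, Any],
            latest_record: Mapping[str, Any],
        ) -> MutableMapping[str, Any]:
            # Keep the highest cursor value seen across this and previous syncs.
            latest = latest_record.get(self.cursor_field, "")
            current = (current_stream_state or {}).get(self.cursor_field, "")
            return {self.cursor_field: max(latest, current)}

        def parse_response(self, response: requests.Response, stream_state=None, **kwargs) -> Iterable[Mapping]:
            cursor = (stream_state or {}).get(self.cursor_field, "")
            for record in response.json():
                # Emit only records strictly newer than the stored cursor.
                if record.get(self.cursor_field, "") > cursor:
                    yield record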
  • Julien

    07/29/2022, 1:58 PM
    Hi 🙂 👋 I was wondering if there is a way to set up connections as code in a Kubernetes environment (with a GitOps tool like ArgoCD, for instance)?
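    One hedged sketch of a GitOps-style approach: keep the connection definitions as YAML in the repo and have a CI job or an ArgoCD post-sync hook apply them through Airbyte's Config API; the in-cluster URL and file layout below are assumptions, and Airbyte's octavia CLI is another option for declarative YAML configuration.
    # Hedged sketch: apply a connection spec stored in git to the Airbyte Config API.
    import requests
    import yaml

    AIRBYTE_API = "http://airbyte-server-svc:8001/api/v1"  # assumed in-cluster service URL

    with open("connections/postgres_to_bigquery.yaml") as f:  # assumed repo layout
        spec = yaml.safe_load(f)  # must contain sourceId, destinationId, syncCatalog, status, ...

    resp = requests.post(f"{AIRBYTE_API}/connections/create", json=spec, timeout=30)
    resp.raise_for_status()
    print("created connection", resp.json()["connectionId"])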
  • Martijn van Leeuwerden

    07/29/2022, 2:03 PM
    Hello! If I want to pull data from Facebook for every user in my database, and for the users that will sign up later, what is the best way to go about this? The source setup requires an ID that is unique to the user. Does that mean I need to set up a connection for every user individually? Do I do this through the Airbyte API? Or do I need to write a custom connector to achieve this? Looking forward to understanding more about how I can use Airbyte in this case. Thanks!
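    If it does come down to one source (and connection) per user, a hedged sketch of automating it through the Airbyte Config API; the IDs and the Facebook Marketing field names below are assumptions used to illustrate the shape of the call.
    # Hedged sketch: create one Facebook Marketing source per user via the Config API.
    import requests

    AIRBYTE_API = "http://localhost:8000/api/v1"  # assumed local deployment


    def create_source_for_user(user_id: str, account_id: str, access_token: str) -> str:
        payload = {
            "workspaceId": "<workspace-uuid>",                       # placeholder
            "sourceDefinitionId": "<facebook-marketing-definition>",  # placeholder
            "name": f"facebook-{user_id}",
            "connectionConfiguration": {
                "account_id": account_id,
                "access_token": access_token,
                "start_date": "2022-01-01T00:00:00Z",
            },
        }
        resp = requests.post(f"{AIRBYTE_API}/sources/create", json=payload, timeout=30)
        resp.raise_for_status()
        return resp.json()["sourceId"]
    Each source created this way still needs its own connection to the destination (connections/create), so a loop over users would typically do both calls.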
  • Sourav Sikka

    07/29/2022, 6:11 PM
    Hello, I just started using Airbyte and want to load data from Google Sheets to BigQuery. I am able to make the source connection (Google Sheets), but I am getting the error below in the destination setup (BigQuery):
    The connection tests failed.
    java.io.IOException: The Application Default Credentials are not available. They are available if running in Google Compute Engine. Otherwise, the environment variable GOOGLE_APPLICATION_CREDENTIALS must be defined pointing to a file defining the credentials. See https://developers.google.com/accounts/docs/application-default-credentials for more information.
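    That error usually means the destination fell back to Application Default Credentials; the BigQuery destination config also accepts a service-account key JSON directly. A hedged sketch for sanity-checking such a key locally before pasting it in (file path and permissions are assumptions):
    # Hedged sketch: verify the service-account key loads and can list datasets.
    from google.cloud import bigquery
    from google.oauth2 import service_account

    creds = service_account.Credentials.from_service_account_file("service-account.json")
    client = bigquery.Client(credentials=creds, project=creds.project_id)
    print([d.dataset_id for d in client.list_datasets()])  # raises if access is denied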
  • Mina Yacoub

    07/29/2022, 10:34 PM
    Hello guys, I am trying to integrate Microsoft Dataverse with other legacy databases using APIs. Is Airbyte a good ELT tool for that?
  • Zaza Javakhishvili

    07/30/2022, 8:53 AM
    Hi guys, can I ask about SSL support for the MariaDB ColumnStore destination? I am using a SkySQL cloud database and it does not support connections without SSL.
  • john eipe

    07/30/2022, 8:07 PM
    Hi team, I have Airbyte running in a container on my Mac. I have a local MySQL server running on the Mac, and when I try to set up the source in Airbyte I get this error:
    Could not connect with provided configuration. Error:
    HikariPool-1 - Connection is not available, request timed out after 60006ms.
    Since Airbyte is in a container and MySQL is on the local machine (Mac), I tried all of these host values: 0.0.0.0, host.docker.internal, and docker.for.mac.localhost, but none of them worked.
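    A hedged first check, run from inside the Airbyte worker or server container: host.docker.internal is the usual host alias on Docker Desktop for Mac, and 3306 is MySQL's default port. If the TCP connection succeeds but Airbyte still times out, the usual suspects are MySQL's bind-address and a user that is only allowed to connect from localhost.
    # Hedged connectivity check from inside the container (host/port are assumptions).
    import socket

    try:
        socket.create_connection(("host.docker.internal", 3306), timeout=5).close()
        print("TCP reachable; next check MySQL bind-address and user@host grants")
    except OSError as exc:
        print(f"cannot reach MySQL from the container: {exc}")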
  • Alex Bondar

    07/31/2022, 12:06 PM
    Hey, after an Airbyte version bump I'm getting backoff restarts of the db pod:
    PostgreSQL Database directory appears to contain a database; Skipping initialization
    
    2022-07-31 12:03:57.486 UTC [1] LOG:  starting PostgreSQL 13.7 on x86_64-pc-linux-musl, compiled by gcc (Alpine 11.2.1_git20220219) 11.2.1 20220219, 64-bit
    2022-07-31 12:03:57.487 UTC [1] LOG:  listening on IPv4 address "0.0.0.0", port 5432
    2022-07-31 12:03:57.487 UTC [1] LOG:  listening on IPv6 address "::", port 5432
    2022-07-31 12:03:57.493 UTC [1] LOG:  listening on Unix socket "/var/run/postgresql/.s.PGSQL.5432"
    2022-07-31 12:03:57.510 UTC [21] LOG:  database system was shut down at 2022-07-31 11:54:11 UTC
    2022-07-31 12:03:57.511 UTC [21] LOG:  record with incorrect prev-link 80010E98/8010E98 at 2/EBE44970
    2022-07-31 12:03:57.511 UTC [21] LOG:  invalid primary checkpoint record
    2022-07-31 12:03:57.511 UTC [21] PANIC:  could not locate a valid checkpoint record
    2022-07-31 12:03:57.686 UTC [1] LOG:  startup process (PID 21) was terminated by signal 6: Aborted
    2022-07-31 12:03:57.686 UTC [1] LOG:  aborting startup due to startup process failure
    2022-07-31 12:03:57.687 UTC [1] LOG:  database system is shut down
    updated to 0.39.37-alpha
  • Yoosuf Zimaam

    08/01/2022, 8:20 AM
    Hello, I would like to know if there's a way to use ClickUp as a source, or if I have to build my own connector for it. ClickUp offers an API key, so what would be the easiest way to load data from ClickUp into a PostgreSQL database?
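    A hedged sketch of the kind of request a custom ClickUp source would make with the personal API token; verify the endpoint and auth header against ClickUp's current API docs before relying on this.
    # Hedged sketch: list teams from ClickUp's public API v2 using a personal token.
    import requests

    CLICKUP_TOKEN = "pk_..."  # personal API token from ClickUp settings

    resp = requests.get(
        "https://api.clickup.com/api/v2/team",
        headers={"Authorization": CLICKUP_TOKEN},
        timeout=30,
    )
    resp.raise_for_status()
    for team in resp.json().get("teams", []):
        print(team["id"], team["name"])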
  • Roberto Malcotti

    08/01/2022, 4:38 PM
    Hey team, can you please confirm that at the moment: • it is not possible to add custom connectors when using Airbyte Cloud • it is not possible to interact with Airbyte using its API. Thanks!
  • Arun

    08/01/2022, 10:58 PM
    Hi -- how do I install Docker Desktop on Windows Server 2016? I want to install Airbyte on Windows Server, but unfortunately it is tightly coupled with the Docker service, and there is no direct option to install Docker on Windows Server. Could someone please guide me on this?
  • Svatopluk Chalupa

    08/02/2022, 9:09 AM
    Hi all, is there a way to vote for a specific feature request? Is there any prioritization mechanism for Airbyte based on public votes? Thanks for any answers :-)
  • Zawar Khan

    08/02/2022, 11:40 AM
    Hi everyone, I have a situation where I don’t get a refresh token. I want to cache my token so it can be passed in the header of every stream.
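    A hedged sketch of one way to do that with the Python CDK: fetch the token once in streams(), cache it on the source, and hand the same TokenAuthenticator to every stream so each request carries the Authorization header. The token endpoint, config keys, and the stream itself are assumptions.
    # Hedged sketch: cache the access token on the source and share one
    # authenticator across all streams. Endpoint/field names are assumptions.
    from typing import Any, Iterable, Mapping, Optional

    import requests
    from airbyte_cdk.sources import AbstractSource
    from airbyte_cdk.sources.streams.http import HttpStream
    from airbyte_cdk.sources.streams.http.requests_native_auth import TokenAuthenticator


    class Contacts(HttpStream):
        url_base = "https://api.example.com/"  # hypothetical API
        primary_key = "id"

        def path(self, **kwargs) -> str:
            return "contacts"

        def next_page_token(self, response: requests.Response) -> Optional[Mapping[str, Any]]:
            return None  # no pagination in this sketch

        def parse_response(self, response: requests.Response, **kwargs) -> Iterable[Mapping]:
            yield from response.json()


    class MySource(AbstractSource):
        _token: Optional[str] = None

        def _get_token(self, config: Mapping[str, Any]) -> str:
            if self._token is None:  # fetched once, then reused by every stream
                resp = requests.post(
                    "https://api.example.com/oauth/token",  # hypothetical endpoint
                    data={
                        "client_id": config["client_id"],
                        "client_secret": config["client_secret"],
                        "grant_type": "client_credentials",
                    },
                    timeout=30,
                )
                resp.raise_for_status()
                self._token = resp.json()["access_token"]
            return self._token

        def check_connection(self, logger, config):
            return True, None  # sketch only

        def streams(self, config: Mapping[str, Any]):
            auth = TokenAuthenticator(token=self._get_token(config))  # Bearer header
            return [Contacts(authenticator=auth)]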
  • Piyawat Pavachansatit

    08/01/2022, 5:21 AM
    Hi, I'm working on a POST API request to get the job information for an Airbyte job, so I'm using the /v1/jobs/get API following https://airbyte-public-api-docs.s3.us-east-2.amazonaws.com/rapidoc-api-docs.html, but I get this error:
    <html>
    <head><title>405 Not Allowed</title></head><script language="javascript"\>
                    window.TRACKING_STRATEGY = "segment";
                    window.FULLSTORY = "";
                    window.AIRBYTE_VERSION = "0.35.57-alpha";
                    window.API_URL = "/api/v1/";
                    window.IS_DEMO = "";
                    </script>
    <body>
    <center><h1>405 Not Allowed</h1></center>
    <hr><center>nginx/1.19.10</center>
    </body>
    </html>
    Please guide me on how to get Airbyte's job information via the API.
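    That 405 page is served by nginx/the webapp (note the window.AIRBYTE_VERSION snippet), which usually means the request missed the API prefix. The Config API lives under /api/v1/..., and jobs/get is a POST with the numeric job id in the body. A hedged sketch against a default docker-compose deployment; the host, port, and job id are assumptions:
    # Hedged sketch: fetch a job by id from the Airbyte Config API.
    import requests

    AIRBYTE_API = "http://localhost:8000/api/v1"

    resp = requests.post(f"{AIRBYTE_API}/jobs/get", json={"id": 123}, timeout=30)
    resp.raise_for_status()
    job = resp.json()["job"]
    print(job["id"], job["status"])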
  • Alexis Charrier

    07/29/2022, 3:03 PM
    Hello guys, I'm trying to modify an existing connector, but I can't push my branch to the GitHub repo. I always get the error: Permission to airbytehq/airbyte.git denied to .... Am I missing something? Do we need special permissions to push a branch to the GitHub repo?
  • Sebastian

    08/02/2022, 1:05 PM
    Hi. I'm new to Airbyte (Cloud) and doing a few experiments. Currently I'm wondering why I don't see an option to add (custom) transformations with source=S3 and destination=BigQuery. Why is this?
  • Andreas Nigg

    07/27/2022, 12:46 PM
    Hey, I've created a custom connector and was able to use it in my airbyte installation. Is there a way to update/change the documentation url for this connector?
  • Ashley Baer

    08/01/2022, 2:21 PM
    Hi - I’ve reported an issue but it seems I’m unable to assign it to myself or add any labels, although the documentation suggests I should be able to assign it to myself. Anyone know what the problem might be?
  • Ashish Rai

    07/14/2022, 8:04 AM
    I am trying to deploy locally, but all I get is this warning in a loop: TemporalUtils(getTemporalClientWhenConnected):243 - Waiting for namespace default to be initialized in temporal... The OS is Amazon Linux 2.
  • Zaza Javakhishvili

    08/02/2022, 2:55 PM
    Hi guys, this is very important for us; can you help? https://github.com/airbytehq/airbyte/issues/15173
  • Arun

    08/02/2022, 3:26 PM
    Hey everyone, I am new to Airbyte; it would be great if someone could help me with this. I am trying to sync a 10M-row table from Postgres to Snowflake, but after pulling about 600k rows it got stuck on the error below:
    2022-08-02 035444 INFO i.a.w.g.DefaultReplicationWorker(lambda$getReplicationRunnable$6):325 - Records read: 698000 (6 GB)
    2022-08-02 035445 destination > 2022-08-02 035445 INFO i.a.i.d.r.SerializedBufferingStrategy(flushWriter):93 - Flushing buffer of stream MTL_SYSTEM_ITEMS (200 MB)
    2022-08-02 035445 destination > 2022-08-02 035445 INFO i.a.i.d.s.StagingConsumerFactory(lambda$flushBufferFunction$3):158 - Flushing buffer for stream MTL_SYSTEM_ITEMS (200 MB) to staging
    2022-08-02 035445 destination > 2022-08-02 035445 INFO i.a.i.d.r.BaseSerializedBuffer(flush):131 - Wrapping up compression and write GZIP trailer data.
    2022-08-02 035445 destination > 2022-08-02 035445 INFO i.a.i.d.r.BaseSerializedBuffer(flush):138 - Finished writing data to 4802c05e-271e-4b8c-b4d2-953dda97865a16960591700195388958.csv.gz (200 MB)
    2022-08-02 035753 destination > at net.snowflake.client.jdbc.internal.amazonaws.internal.SdkFilterInputStream.close(SdkFilterInputStream.java:99)
    2022-08-02 035753 destination > at net.snowflake.client.jdbc.internal.amazonaws.event.ProgressInputStream.close(ProgressInputStream.java:211)
    2022-08-02 035753 destination > at net.snowflake.client.jdbc.internal.amazonaws.util.IOUtils.closeQuietly(IOUtils.java:70)
    2022-08-02 035753 destination > at net.snowflake.client.jdbc.internal.amazonaws.http.AmazonHttpClient$RequestExecutor.closeQuietlyForRuntimeExceptions(AmazonHttpClient.java:767)
    2022-08-02 035753 destination > at net.snowflake.client.jdbc.internal.amazonaws.http.AmazonHttpClient$RequestExecutor.doExecute(AmazonHttpClient.java:760)
    2022-08-02 035753 destination > at net.snowflake.client.jdbc.internal.amazonaws.http.AmazonHttpClient$RequestExecutor.executeWithTimer(AmazonHttpClient.java:719)
    2022-08-02 035753 destination > at net.snowflake.client.jdbc.internal.amazonaws.http.AmazonHttpClient$RequestExecutor.execute(AmazonHttpClient.java:701)
    2022-08-02 035753 destination > at net.snowflake.client.jdbc.internal.amazonaws.http.AmazonHttpClient$RequestExecutor.access$500(AmazonHttpClient.java:669)
    2022-08-02 035753 destination > at net.snowflake.client.jdbc.internal.amazonaws.http.AmazonHttpClient$RequestExecutionBuilderImpl.execute(AmazonHttpClient.java:651)
    2022-08-02 035753 destination > at net.snowflake.client.jdbc.internal.amazonaws.http.AmazonHttpClient.execute(AmazonHttpClient.java:515)
    2022-08-02 035753 destination > at net.snowflake.client.jdbc.internal.amazonaws.services.s3.AmazonS3Client.invoke(AmazonS3Client.java:4443)
    2022-08-02 035753 destination > at net.snowflake.client.jdbc.internal.amazonaws.services.s3.AmazonS3Client.invoke(AmazonS3Client.java:4390)
    2022-08-02 035753 destination > at net.snowflake.client.jdbc.internal.amazonaws.services.s3.AmazonS3Client.doUploadPart(AmazonS3Client.java:3395)
    2022-08-02 035753 destination > at net.snowflake.client.jdbc.internal.amazonaws.services.s3.AmazonS3Client.uploadPart(AmazonS3Client.java:3380)
    2022-08-02 035753 destination > at net.snowflake.client.jdbc.internal.amazonaws.services.s3.transfer.internal.UploadPartCallable.call(UploadPartCallable.java:33)
    2022-08-02 035753 destination > at net.snowflake.client.jdbc.internal.amazonaws.services.s3.transfer.internal.UploadPartCallable.call(UploadPartCallable.java:23)
    2022-08-02 035753 destination > at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
    2022-08-02 035753 destination > at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136)
    2022-08-02 035753 destination > at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635)
    2022-08-02 035753 destination > at java.base/java.lang.Thread.run(Thread.java:833)
    2022-08-02 035753 destination >
  • Arun

    08/02/2022, 3:27 PM
    Is there any setting or limit that we need to modify?
  • Yash Makwana

    08/02/2022, 6:18 PM
    Hi team, I'm having this error while running the source acceptance tests, and this is my catalog.json:
    {
      "stream": {
        "name": "contact",
        "json_schema": {},
        "supported_sync_modes": ["full_refresh"]
      },
      "sync_mode": "full_refresh",
      "destination_sync_mode": ["append"],
      "primary_key": [["id"]]
    }
    Can someone point out my mistake?