# ask-community-for-troubleshooting
  • Hiroto Yamakawa

    05/03/2022, 11:47 AM
    Hi, is it possible to sort the sources/destinations in the UI, or to add a tag to group and filter them?
  • Gangeshwar Krishnamurthy

    05/03/2022, 12:12 PM
    Hello Airbyte team, I am trying to sync Amplitude data to a Postgres DB. I have set up a new connection but the sync failed. Here’s the log file. Please advise on how to fix this. Thank you!
    logs-967-0.txt
  • Hiroto Yamakawa

    05/03/2022, 1:36 PM
    Hi, I found this thread when looking for information on pre-filtering columns before ingestion. My team needs to extract data from tables in Salesforce or S3, but for security/legal reasons some columns must not be ingested into the DWH in the first place. • What are the current alternatives for achieving this? • Is this a feature Airbyte would consider integrating in the future? (EDIT: found this link through the forum; guess it won't be on the roadmap anytime soon 😞, any suggestions are welcome 🙏: https://github.com/airbytehq/airbyte/issues/5541) Many thanks!
  • Jeff

    05/03/2022, 2:02 PM
    Hi all 🙂 For Google Cloud SQL, I need to authorize an IP address to allow access to the database. How can I find out which IP Airbyte Cloud would connect from?
  • Yash Makwana

    05/03/2022, 2:22 PM
    Hi all, I want to import data from an API, and I followed the docs for creating a custom source connector. I got through the spec, check, and discover steps, but got this error while reading data. Can someone help me fix it?
  • Bryan Post

    05/03/2022, 7:58 PM
    Hello! I’m a Data Engineer and excited to try OSS Airbyte. I’ve read the docs and searched the discussion forum but can’t find a concrete answer for our two major use cases: • We use Snowflake, and authenticate with RSA tokens or Okta SSO (external browser). Will this be possible in Airbyte? If we needed to connect to Airbyte via OAuth 2.0, what would we have to do? • We have a lot of Hive tables stored in AWS S3, partitioned by date. Would this S3 configuration work with Airbyte as a source and/or destination?
  • Jordan Stein

    05/03/2022, 8:46 PM
    Hey y'all, just getting started with Airbyte. Was wondering if there is any update on supporting deployment with ECS? We run all our services on ECS and don't want to also support running things on EC2, so we will probably wait to deploy Airbyte until this is available.
  • Stéphan Taljaard

    05/04/2022, 8:26 AM
    Hi! I'm new to Airbyte - I took a quick look to see if it will suit my use case, and it seems not? Details in the thread.
  • Hiroto Yamakawa

    05/04/2022, 9:06 AM
    Hi, has anyone been able to load a JSON file from an S3 URL using the *File connector*?
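For context, pointing the File source at a file in S3 takes a config shaped roughly like this. This is an illustrative fragment only: the field names follow the connector's spec of that era, and the bucket, key, and credentials here are made up.

```json
{
  "dataset_name": "my_json_dataset",
  "format": "json",
  "url": "s3://my-bucket/path/to/file.json",
  "provider": {
    "storage": "S3",
    "aws_access_key_id": "<key id>",
    "aws_secret_access_key": "<secret>"
  }
}
```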
  • Stuart Middleton

    05/04/2022, 11:27 AM
    Newbie question about the API: we're trying to use it to extract logs. We've had success retrieving the server and scheduler logs, and we've identified that the logs for a job are located in the job details (api/v1/jobs/get), but that call requires a job ID from the list of jobs (api/v1/jobs/list), which in turn requires a configId, and I'm struggling to find where to get this configId from. Guidance please?
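In case it helps later readers: for sync jobs, the configId that api/v1/jobs/list expects is the connectionId of the connection itself, which you can get from api/v1/connections/list (given your workspaceId) or from the connection's URL in the UI. A minimal sketch in Python; the base URL and helper names are my assumptions, not anything from this thread:

```python
import json
import urllib.request

AIRBYTE_API = "http://localhost:8000/api/v1"  # assumed local deployment


def jobs_list_payload(connection_id, config_types=("sync",)):
    """Request body for /v1/jobs/list: configId is the connectionId."""
    return {"configTypes": list(config_types), "configId": connection_id}


def list_jobs(connection_id):
    """POST to /v1/jobs/list and return the job summaries (with their logs)."""
    req = urllib.request.Request(
        f"{AIRBYTE_API}/jobs/list",
        data=json.dumps(jobs_list_payload(connection_id)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["jobs"]
```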
  • Julian Toledo Cuenca

    05/04/2022, 1:26 PM
    Hi team! I’m new to Airbyte and very excited about the product! I’m testing it for use at my company, but I’m having trouble connecting the open-source version to a local MongoDB container. I can set up the source without errors and explore collections, but when syncing data the pipeline throws
    Source process exited with non-zero exit code 1
    . Has this happened to anyone? Thank you!
  • Lorena Berrón Cadenas

    05/04/2022, 8:52 PM
    Hi team! I'm new to Airbyte. I'm testing the connection to SurveyMonkey and I get the same error message. I suspect it might be because I'm using a free individual SurveyMonkey account and can't create private apps, but before I pay for the upgrade, I want to confirm there's no way to make it work without it. In case anyone can help, thank you!
  • Connor Lough

    05/04/2022, 10:55 PM
    Hey all, with S3 as a destination, all the S3 Path Format variables reference the time "that the sync was writing output". Is there any way to get the timestamp of the start of the entire sync? I'd like to make one S3 folder for the whole sync (currently 120 files) and then reference the most recent sync, as opposed to the most recent file from the sync.
  • Charlie Shou

    05/05/2022, 1:00 AM
    I'm using Airbyte to export data from Postgres, MySQL, etc. into an S3 bucket. It works well but takes ~2 minutes even when the datasets are relatively small. Is there a way to speed this up? Why does it take this long to sync sources?
  • Charlie Shou

    05/05/2022, 1:04 AM
    I'm using
    /v1/connections/sync
    to trigger the connection and
    /v1/jobs/get
    to retrieve the status of the job.
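That trigger-then-poll flow can be sketched end to end like this. Hedged: the endpoint paths are the internal Airbyte API of that era, the base URL is an assumed local deployment, and the set of terminal statuses below is my assumption; check it against the jobs/get response for your version.

```python
import json
import time
import urllib.request

AIRBYTE_API = "http://localhost:8000/api/v1"  # assumed local deployment


def post(path, body):
    """POST a JSON body to the Airbyte API and return the parsed response."""
    req = urllib.request.Request(
        f"{AIRBYTE_API}{path}",
        data=json.dumps(body).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)


def is_terminal(status):
    """True once a job has stopped running (successfully or not)."""
    return status in {"succeeded", "failed", "cancelled"}


def run_sync(connection_id, poll_seconds=10):
    """Trigger a sync, then poll /jobs/get until the job reaches a terminal state."""
    job = post("/connections/sync", {"connectionId": connection_id})["job"]
    while not is_terminal(job["status"]):
        time.sleep(poll_seconds)
        job = post("/jobs/get", {"id": job["id"]})["job"]
    return job["status"]
```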
  • Vishnu K

    05/05/2022, 6:25 AM
    Hi, kudos team! Need help here 🚁 I'm trying to do a PoC on DMS using Airbyte, with MSSQL as the source and Postgres on AWS RDS as the destination. However, datatypes like int and timestamp are getting converted to float and string respectively.
  • Akshay Agarwal

    05/05/2022, 7:45 AM
    How does Airbyte fetch tables for a particular schema? I see only two tables being fetched for my Postgres source. Please help!
  • Joey Taleño

    05/05/2022, 12:27 PM
    Hi Airbyte team, I have a successful connection between our Postgres source DB and BigQuery destination. However, I am getting this error... I really need help... 🙏
  • Brandon Barclay

    05/05/2022, 4:15 PM
    Has anybody set up Caddy with SSL? I need to set up SSL to make this work. Thanks!
  • Kyle Hancock

    05/05/2022, 5:53 PM
    Hi all, trying to set up an MSSQL-to-DynamoDB connection. I have the source and destination set up successfully, but when I try to sync I get this error:
    2022-05-05 17:51:32 destination > 2022-05-05 17:51:32 INFO i.a.i.d.d.DynamodbWriter(close):161 - Data writing completed for DynamoDB.
    2022-05-05 17:51:32 destination > 2022-05-05 17:51:32 ERROR i.a.i.d.d.DynamodbWriter(close):159 - Cannot invoke "java.util.Collection.size()" because the return value of "com.amazonaws.services.dynamodbv2.document.TableWriteItems.getItemsToPut()" is null
    Any thoughts?
  • Garrett McClintock

    05/05/2022, 8:52 PM
    My initial run of sources works, but subsequent runs are failing with this error (normalization is successful, but the JSON schema validation fails). This has happened on both the Jira and GitHub connectors; any ideas?
    2022-05-05 20:47:48 normalization > 20:47:48  Finished running 354 incremental models in 599.35s.
    2022-05-05 20:47:48 normalization > 20:47:48  
    2022-05-05 20:47:48 normalization > 20:47:48  Completed successfully
    2022-05-05 20:47:48 normalization > 20:47:48  
    2022-05-05 20:47:48 normalization > 20:47:48  Done. PASS=354 WARN=0 ERROR=0 SKIP=0 TOTAL=354
    2022-05-05 20:47:48 INFO i.a.w.DefaultNormalizationWorker(run):71 - Normalization executed in 10 minutes 52 seconds.
    2022-05-05 20:47:48 INFO i.a.w.DefaultNormalizationWorker(run):77 - Normalization summary: io.airbyte.config.NormalizationSummary@68251b7b[startTime=1651783016153,endTime=1651783668940]
    2022-05-05 20:47:48 INFO i.a.w.t.TemporalAttemptExecution(get):134 - Stopping cancellation check scheduling...
    2022-05-05 20:47:48 INFO i.a.w.t.TemporalUtils(withBackgroundHeartbeat):236 - Stopping temporal heartbeating...
    2022-05-05 20:47:49 INFO i.a.v.j.JsonSchemaValidator(test):56 - JSON schema validation failed. 
    errors: $.access_token: is missing but it is required, $.refresh_token: is missing but it is required
    2022-05-05 20:47:49 INFO i.a.v.j.JsonSchemaValidator(test):56 - JSON schema validation failed. 
    errors: $.method: does not have a value in the enumeration [Standard]
  • ByungHo Lee

    05/06/2022, 2:37 AM
    The docs say: "Start Airbyte: First of all, make sure you have Docker and Docker Compose installed. Then run the following commands: git clone https://github.com/airbytehq/airbyte.git && cd airbyte && docker-compose up". I ran the above, but got the following error: ERROR: Invalid interpolation format for "environment" option in service "worker": "CONFIG_DATABASE_PASSWORD=${CONFIG_DATABASE_PASSWORD:-}"
  • ByungHo Lee

    05/06/2022, 2:38 AM
    What should I do?
  • Radim Sasinka

    05/06/2022, 12:11 PM
    Hello everyone. I'm experimenting with octavia-cli and have an issue generating a connection between two Postgres databases. Creating the source and destination works fine, but when I generate the YAML file for the connection, the ending looks like this:
    syncCatalog: # OPTIONAL | object | 🚨 ONLY edit streams.config, streams.stream should not be edited as schema cannot be changed.
        null
        ...
    This causes an error when running
    octavia apply
    . Can somebody help me with the proper values/template, or is there another template I can use? See the stack trace here: https://pastebin.com/uW6hqivv Thanks!
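For reference, a populated syncCatalog in a connection YAML of that era looks roughly like this. Illustrative only: the stream and column names are invented, and octavia normally fills streams.stream from the source's discovered schema, which is why only streams.config should be hand-edited.

```yaml
syncCatalog:
  streams:
    - config:
        aliasName: users
        selected: true
        syncMode: full_refresh
        destinationSyncMode: overwrite
      stream:
        name: users
        namespace: public
        jsonSchema:
          type: object
          properties:
            id:
              type: number
        supportedSyncModes:
          - full_refresh
          - incremental
```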
  • Christopher Brunton

    05/06/2022, 3:59 PM
    Just pushed my first Airbyte connection to prod, Postgres to Snowflake! I had to share somewhere 🤷
  • Sami RIAHI

    05/06/2022, 5:00 PM
    Hello everyone, I'm trying to rebuild a connector and I got this error. Can you please help?
  • Sterling Paramore

    05/06/2022, 11:15 PM
    I need to build an integration for QuickBase, and I’m considering Airbyte as the integration framework. Unfortunately, I’m having a bit of trouble getting started because the tutorial page is blank!
  • Andres Carral

    05/07/2022, 1:40 AM
    I'm taking my first steps with Airbyte Cloud, connecting MongoDB to BigQuery. I can set up the source and the destination, but I can't create the connection: it fails while "fetching the schema". I tried granting read-only permission on each collection individually, and also the find and list_collections permissions, but I can't move forward. I appreciate any help. Good evening.
  • mohd shaikh

    05/08/2022, 7:09 AM
    Hey everyone, I'm new to Airbyte and currently exploring it. I'm looking for good resources on setting up connections from sources like Facebook Ads, MySQL, etc. to Google BigQuery. Can anyone help me get started? TIA!
  • Athanasios Spyriou

    05/09/2022, 10:20 AM
    Hello everyone, I am new to Airbyte. Could anyone please let me know how to deal with the following error?
    Cannot publish to S3: Storage backend has reached its minimum free disk threshold. Please delete a few objects to proceed. (Service: Amazon S3; Status Code: 507; Error Code: XMinioStorageFull;
    How could I add my own bucket? thanks!
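The XMinioStorageFull code comes from the MinIO container that the open-source deployment bundles for log storage, so the immediate fix is freeing disk space on the host. To point logs at your own S3 bucket instead, the docker .env of that era exposed settings along these lines (hedged: check the variable names against the .env shipped with your Airbyte version; the values below are placeholders):

```shell
# in Airbyte's .env, then restart the stack
S3_LOG_BUCKET=my-airbyte-logs
S3_LOG_BUCKET_REGION=us-east-1
AWS_ACCESS_KEY_ID=<key id>
AWS_SECRET_ACCESS_KEY=<secret>
# leave the MinIO endpoint empty to target real S3 instead of the bundled MinIO
S3_MINIO_ENDPOINT=
```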