# ask-community-for-troubleshooting

    Aditya

    05/13/2021, 5:56 AM
    Facing issues connecting to MSSQL as a source. I was able to connect to the SQL Server using the same hostname and port from other apps like Studio 3T, but Airbyte throws an error. I also turned my firewall off.
    ✅ 1

    Rahul Padhy

    05/13/2021, 7:41 AM
    Hey guys, my question is about column names generated for BigQuery tables by Airbyte (in my case, a BigQuery warehouse is the destination) - is there any way to coerce the column names to match those generated by Fivetran, as described here - https://fivetran.com/docs/getting-started/core-concepts#namingconventions
    ✅ 1

    Ping

    05/13/2021, 10:34 PM
    Hi there! I'm trying to migrate from Airbyte 0.14.1-alpha to 0.22.3-alpha, which appears to be the current version. However, I'm getting "No migration found for target version: 0.22.3-alpha".
    [ec2-user@ip-172-31-45-53 ~]$ docker run --rm -v /home/ec2-user/0.14.1-alpha:/config airbyte/migration:0.22.3-alpha -- --input /config/airbyte_archive.tar.gz --output /config/airbyte_archive_migrated.tar.gz --target-version 0.22.3-alpha
    2021-05-13 22:31:38 INFO i.a.m.MigrationRunner(parse):78 - {} - args: [--input, /config/airbyte_archive.tar.gz, --output, /config/airbyte_archive_migrated.tar.gz, --target-version, 0.22.3-alpha]
    2021-05-13 22:31:38 INFO i.a.m.MigrationRunner(run):50 - {} - Unpacking tarball
    2021-05-13 22:31:38 INFO i.a.m.MigrationRunner(run):67 - {} - Running migrations...
    2021-05-13 22:31:38 INFO i.a.m.MigrationRunner(run):68 - {} - MigrateConfig{inputPath=/tmp/airbyte_migrate1347150974477562104/uncompressed, outputPath=/tmp/airbyte_migrate1347150974477562104/output, targetVersion='0.22.3-alpha'}
    2021-05-13 22:31:39 INFO i.a.m.Migrate(run):88 - {} - Starting migrations. Current version: 0.14.1-alpha, Target version: 0.22.3-alpha
    Exception in thread "main" java.lang.IllegalArgumentException: No migration found for target version: 0.22.3-alpha
            at com.google.common.base.Preconditions.checkArgument(Preconditions.java:142)
            at io.airbyte.migrate.Migrate.run(Migrate.java:101)
            at io.airbyte.migrate.MigrationRunner.run(MigrationRunner.java:70)
            at io.airbyte.migrate.MigrationRunner.main(MigrationRunner.java:108)
    ✅ 1

    Ping

    05/14/2021, 12:04 AM
    Okay, the migration looked good; all my connections came back. But they won't start running; the Sync History says "Pending" and
    Waiting for logs...
    The console output shows this error repeating over and over:
    airbyte-scheduler   | 2021-05-14 00:04:23 ERROR i.a.c.l.Exceptions(swallow):84 - {workspace_app_root=/tmp/workspace/scheduler/logs} - Swallowed error.
    airbyte-scheduler   | java.lang.IllegalArgumentException: Job type getSpec is not allowed!
    airbyte-scheduler   |   at com.google.common.base.Preconditions.checkArgument(Preconditions.java:142) ~[guava-30.1-jre.jar:?]
    airbyte-scheduler   |   at io.airbyte.scheduler.persistence.job_tracker.JobTracker.lambda$trackSync$3(JobTracker.java:129) ~[io.airbyte.airbyte-scheduler-persistence-0.22.3-alpha.jar:?]
    airbyte-scheduler   |   at io.airbyte.commons.lang.Exceptions.swallow(Exceptions.java:82) [io.airbyte-airbyte-commons-0.22.3-alpha.jar:?]
    airbyte-scheduler   |   at io.airbyte.scheduler.persistence.job_tracker.JobTracker.trackSync(JobTracker.java:126) [io.airbyte.airbyte-scheduler-persistence-0.22.3-alpha.jar:?]
    airbyte-scheduler   |   at io.airbyte.scheduler.app.JobSubmitter.trackSubmission(JobSubmitter.java:129) [io.airbyte.airbyte-scheduler-app-0.22.3-alpha.jar:?]
    airbyte-scheduler   |   at io.airbyte.scheduler.app.JobSubmitter.lambda$run$0(JobSubmitter.java:71) [io.airbyte.airbyte-scheduler-app-0.22.3-alpha.jar:?]
    airbyte-scheduler   |   at java.util.Optional.ifPresent(Optional.java:176) [?:?]
    airbyte-scheduler   |   at io.airbyte.scheduler.app.JobSubmitter.run(JobSubmitter.java:70) [io.airbyte.airbyte-scheduler-app-0.22.3-alpha.jar:?]
    airbyte-scheduler   |   at io.airbyte.scheduler.app.SchedulerApp.lambda$start$0(SchedulerApp.java:137) [io.airbyte.airbyte-scheduler-app-0.22.3-alpha.jar:?]
    airbyte-scheduler   |   at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) [?:?]
    airbyte-scheduler   |   at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) [?:?]
    airbyte-scheduler   |   at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) [?:?]
    airbyte-scheduler   |   at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1130) [?:?]
    airbyte-scheduler   |   at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:630) [?:?]
    airbyte-scheduler   |   at java.lang.Thread.run(Thread.java:832) [?:?]
    airbyte-scheduler   | 2021-05-14 00:04:23 ERROR i.a.s.a.JobSubmitter(run):78 - {workspace_app_root=/tmp/workspace/scheduler/logs} - Job Submitter Error
    airbyte-scheduler   | java.lang.IllegalArgumentException: Does not support job type: GET_SPEC
    airbyte-scheduler   |   at io.airbyte.scheduler.app.worker_run.TemporalWorkerRunFactory.createSupplier(TemporalWorkerRunFactory.java:78) ~[io.airbyte.airbyte-scheduler-app-0.22.3-alpha.jar:?]
    airbyte-scheduler   |   at io.airbyte.scheduler.app.worker_run.TemporalWorkerRunFactory.create(TemporalWorkerRunFactory.java:55) ~[io.airbyte.airbyte-scheduler-app-0.22.3-alpha.jar:?]
    airbyte-scheduler   |   at io.airbyte.scheduler.app.JobSubmitter.submitJob(JobSubmitter.java:84) ~[io.airbyte.airbyte-scheduler-app-0.22.3-alpha.jar:?]
    airbyte-scheduler   |   at io.airbyte.scheduler.app.JobSubmitter.lambda$run$0(JobSubmitter.java:72) ~[io.airbyte.airbyte-scheduler-app-0.22.3-alpha.jar:?]
    airbyte-scheduler   |   at java.util.Optional.ifPresent(Optional.java:176) ~[?:?]
    airbyte-scheduler   |   at io.airbyte.scheduler.app.JobSubmitter.run(JobSubmitter.java:70) [io.airbyte.airbyte-scheduler-app-0.22.3-alpha.jar:?]
    airbyte-scheduler   |   at io.airbyte.scheduler.app.SchedulerApp.lambda$start$0(SchedulerApp.java:137) [io.airbyte.airbyte-scheduler-app-0.22.3-alpha.jar:?]
    airbyte-scheduler   |   at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) [?:?]
    airbyte-scheduler   |   at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) [?:?]
    airbyte-scheduler   |   at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) [?:?]
    airbyte-scheduler   |   at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1130) [?:?]
    airbyte-scheduler   |   at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:630) [?:?]
    airbyte-scheduler   |   at java.lang.Thread.run(Thread.java:832) [?:?]

    Aleksandr

    05/14/2021, 8:44 AM
    Hi! I have a question. I've now set up a connection: the source is my DB from Docker, the destination is another empty DB in the same container. After the sync I see a succeeded status, but I don't see any changes in the destination DB. No errors / warnings in the log. Perhaps I'm doing something wrong?
    ✅ 1

    Adhithya Ravichandran

    05/14/2021, 4:11 PM
    Hi, I've started to test something locally. I tried adding a source that is a MongoDB Atlas cluster in our dev environment. I'm able to connect to it programmatically from my local machine. I've tried different things, including specifying the Mongo replica set. Additional details: we're behind a cloud VPN. Might that affect things, since Airbyte is containerized? Log text:
    DefaultAirbyteStreamFactory(lambda$create$0):73 - W, [2021-05-14T16:07:49.870271 #8] WARN -- : MONGODB | Error running ismaster on <redacted server name>.mongodb.net:27017: SocketError: getaddrinfo: Name does not resolve
    2021-05-14 16:08:50 INFO (/tmp/workspace/c0bdcfee-0a4d-48c0-8de1-560e9908d59b/0) DefaultAirbyteStreamFactory(internalLog):110 - [2021-05-14 16:07:50 +0000] WARN : | "MONGODB | Error running ismaster on <redacted server name>.mongodb.net:27017: SocketError: getaddrinfo: Name does not resolve"
    2021-05-14 16:08:50 INFO (/tmp/workspace/c0bdcfee-0a4d-48c0-8de1-560e9908d59b/0) TemporalAttemptExecution(get):134 - Stopping cancellation check scheduling...
    Help is greatly appreciated 🙂
    👀 1

    Matt Fleshman

    05/17/2021, 3:40 PM
    I used to work with Fivetran primarily (left because of price), so I'm trying to pick up more of the technicals of ELT now. Trying to get clearer on Incremental vs. Incremental Append. My understanding is that normal Incremental syncs only new rows incrementally, usually decided using a primary key, whereas Incremental Append syncs all rows from a short recent time window every sync, which creates duplicates in your warehouse. My understanding is that with something like Salesforce, because of its robust primary key, it usually won't need to be Incremental Append. Stitch, for example, doesn't append-only this table. The documentation also just says Incremental, but there is still an option in the configuration to dedupe. Does a source like this actually end up needing to be deduped? https://docs.airbyte.io/integrations/sources/salesforce
    ✅ 1
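    A toy sketch of the append-vs-dedupe difference (hypothetical rows and field names, not Airbyte internals): in append mode, a row that is re-emitted after an update simply lands in the warehouse a second time, so the warehouse holds the row's history; deduping keeps only the latest version per primary key.

```python
# Toy illustration: why append mode can leave duplicates, and how
# deduping by primary key + cursor field collapses them.

def incremental_append(warehouse, batch):
    """Append every synced record as-is; re-synced rows become duplicates."""
    warehouse.extend(batch)
    return warehouse

def dedupe_latest(warehouse, key="id", cursor="updated_at"):
    """Keep only the most recent version of each primary key."""
    latest = {}
    for row in warehouse:
        k = row[key]
        if k not in latest or row[cursor] > latest[k][cursor]:
            latest[k] = row
    return sorted(latest.values(), key=lambda r: r[key])

warehouse = []
sync1 = [{"id": 1, "updated_at": "2021-05-01", "status": "open"}]
sync2 = [{"id": 1, "updated_at": "2021-05-02", "status": "closed"},  # row 1 changed
         {"id": 2, "updated_at": "2021-05-02", "status": "open"}]    # new row

incremental_append(warehouse, sync1)
incremental_append(warehouse, sync2)
print(len(warehouse))            # 3 rows: id 1 appears twice
print(dedupe_latest(warehouse))  # 2 rows: only the latest version of each id
```

    So whether a cursor-based source ends up needing the dedupe step mostly depends on whether updated records get re-emitted with a newer cursor value; when they do, append mode accumulates the old versions.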

    Mané Rom

    05/17/2021, 3:53 PM
    Good afternoon. We are getting started with Airbyte. We'd like to attach this awesome tool to our web platform. Are there any docs or examples about how an Airbyte implementation could look in a Node.js environment? We work with Vue + Node.js. Thanks in advance

    Anton Nosovitsky

    05/18/2021, 8:55 AM
    Am I supposed to see something else in the logs once the server is ready? Have I missed something?
    ✅ 1

    Anton Nosovitsky

    05/18/2021, 10:02 AM
    Is there any way to know where in the sync a connector is?
    😛 1

    Rahul Padhy

    05/18/2021, 12:25 PM
    Hey everyone, hope you all are doing well. I need some help locating the code where column names are changed during Airbyte's normalization step. Can anyone please point me to that portion of the airbyte repo on GitHub? I'm looking to make some custom changes myself, so any help would be really appreciated! 🙂
    ✅ 1

    Priyanka MCP

    05/19/2021, 3:48 PM
    Hi all, facing the below issue while doing a Gradle build:
    > Task airbyte cliairbyteDocker FAILED
    FAILURE: Build failed with an exception.
    * What went wrong: Execution failed for task 'airbyte cliairbyteDocker'.
    A problem occurred starting process 'command 'E:\Priyanka\workspace\Airbyte\airbyte\tools\bin\build_image.sh''
    Any pointers, please?
    ✅ 1

    Warren Gaspay

    05/20/2021, 8:08 AM
    Hi all, is there a way to test incremental syncing locally?

    Rahul Padhy

    05/20/2021, 9:45 AM
    I need some help testing some changes I made locally. I'm changing the file airbyte/airbyte-integrations/bases/base-normalization/normalization/transform_catalog/destination_name_transformer.py in the function transform_standard_naming at line 189. To debug my changes, should I restart the web app hosted on Docker, or just start a new connection sync in the UI?
    ✅ 1

    Alex Koshterek

    05/21/2021, 9:58 AM
    Checking the upgrading docs, the process looks rather tricky and difficult to automate. Are there any plans to make configuration upgrades possible via the UI / API? Like: 1. Export configuration 2. Install a fresh Airbyte version (or spin up a new VM) 3. Import the exported file back 4. Migration happens automatically 5. ??? PROFIT. I've made a `packer` script building a VM, and switching to a new Airbyte is a matter of a one-line change. Also, are there e2e examples of API calls for setting up connections?
    ✅ 1

    Jeff Crooks

    05/21/2021, 2:18 PM
    Having issues getting up and running on AWS EC2
    ✅ 1

    John Gaspar

    05/22/2021, 6:11 AM
    Every time, it says it's successful in Airbyte, but nothing's in Postgres.
    ✅ 1

    John Gaspar

    05/22/2021, 6:25 AM
    I'm using normalization on postgres's side as a destination. Is this not supported for this kind of transfer?

    John Gaspar

    05/22/2021, 7:26 AM
    Is mariadb already supported? https://github.com/airbytehq/airbyte/issues/2011 I just tested as per Chris's instructions, and it seemed to work.
    ✅ 1
    👍 1

    UrbunMe

    05/22/2021, 10:15 AM
    Hi, is QuickBooks Desktop supported as a source?
    ✅ 1

    Rahul Padhy

    05/25/2021, 11:41 AM
    Hey everyone, hope you're doing well. I have a query regarding the Airbyte API. Given a sourceDefinitionId, I'm trying to get a file source as described here -> https://airbyte-public-api-docs.s3.us-east-2.amazonaws.com/rapidoc-api-docs.html#post-/v1/source_definitions/get. For this, I'm using the requests library in Python and sending a POST request with JSON. But even after verifying all my inputs, I keep being told that my JSON format is invalid. This is my basic code:
    import requests
    BASE_URL = 'http://localhost:8001/api'
    get_source = '/v1/sources/get'
    file_source = {"sourceDefinitionId": "778daa7c-feaf-4db6-96f3-70fd645acc77"}
    full_url = BASE_URL + get_source
    response = requests.post(full_url, json=file_source)
    print(response.text)
    And here’s the output that I’m getting :-
    {"message":"Invalid JSON","details":"Unrecognized field \"sourceDefinitionId\" (class io.airbyte.api.model.SourceIdRequestBody), not marked as ignorable"}
    I got the sourceDefinitionId for the source type Files from the response of the list-all-sourceDefinitions endpoint -> https://airbyte-public-api-docs.s3.us-east-2.amazonaws.com/rapidoc-api-docs.html#post-/v1/source_definitions/list Please tell me where I'm going wrong.
    ✅ 1
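    For reference, the error message names `SourceIdRequestBody`, which suggests the request landed on the sources endpoint (expecting `sourceId`) rather than the source-definitions endpoint the linked docs describe. A small sketch of pointing the same payload at `/v1/source_definitions/get` instead (assuming a local deployment on port 8001, as in the snippet above):

```python
# Sketch of the mismatch: /v1/sources/get expects {"sourceId": ...},
# while {"sourceDefinitionId": ...} is the body for /v1/source_definitions/get.

BASE_URL = 'http://localhost:8001/api'

def build_request(source_definition_id):
    """Endpoint/payload pair for fetching a source definition by its id."""
    return (BASE_URL + '/v1/source_definitions/get',
            {"sourceDefinitionId": source_definition_id})

url, payload = build_request("778daa7c-feaf-4db6-96f3-70fd645acc77")
print(url)  # http://localhost:8001/api/v1/source_definitions/get
# requests.post(url, json=payload) against a running instance should then
# return the Files source definition instead of the "Invalid JSON" error.
```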

    Rahul Padhy

    05/25/2021, 2:12 PM
    What is the actual operation that’s performed when the following API endpoint is hit?
    ✅ 1

    KRISHNA MALLIK

    05/25/2021, 2:57 PM
    Hi, I see that the HubSpot connector uses the legacy API to fetch data, and objects like contacts, campaigns, etc. (except email events and subscription changes) don't support incremental load yet. Question 1) Is there a way to enable incremental load for those models? Question 2) Will the new HubSpot connector start supporting OAuth soon? Question 3) To HubSpot users: is there a way to extract the Google Analytics data (that HubSpot uses to calculate pay-per-click metrics) using the HubSpot connector? Thanks!
    ✅ 1

    Ievgenii Tsybaiev [GL]

    05/25/2021, 7:10 PM
    Hi there. I'm new to Airbyte. Currently trying to follow the getting-started guide https://github.com/airbytehq/airbyte/blob/e378d40236b6a34e1c1cb481c8952735ec687d88/docs/quickstart/getting-started.md But when it comes to creating a connection, it fails (see attached capture). Could somebody please advise on what went wrong and how to fix it? Does it happen to everyone? Many thanks in advance!
    👀 1

    Dan Spohn

    05/26/2021, 8:15 PM
    Hi there. I'm trying to set up my first connection in Airbyte. I am running the Docker image of the most recent version (I just did a git pull before starting it). I am in the Onboarding section. I set my source to "File", attempting to pull a CSV file from an SFTP site, and I set my destination as Redshift. Based on the error I'm receiving, I believe I may not have properly set the configuration for the CSV file. However, I can't figure out any way to go back in the process (or to restart it) so that I can fix the CSV configuration. The Airbyte application continually brings me back to the page shown in the screenshot (http://localhost:8000/onboarding) (I've attempted to type different URLs, hit the back button, etc.). Or, to say it another way, I am stuck on step #3 of Onboarding with an error and can't figure out how to go back to step #1 or how to cancel the setup of this connection. Is there a way to go back during Onboarding, or a way to restart Onboarding?

    Rahul Padhy

    05/27/2021, 8:12 AM
    Hey guys, I've been using Airbyte to sync my data from an S3 source into BigQuery. I've made 6 different connectors that sync data into 6 different tables in the same BigQuery dataset. But one of the connectors has been failing to sync for about the past couple of days. Interestingly, this connector was working fine before that. I skimmed through the logs and from there I got to know the S3 endpoint URL for that source. But if that's the case, I'm unable to understand how things were working fine before, and, the URI aside, the other 5 connectors have the same settings and are working fine. I'm attaching the related snapshots and the log file below in this conversation thread.
    ✅ 1

    Ronen Yaari

    05/27/2021, 9:19 AM
    Hi, when I use "Local CSV" destination on a local deployment on windows how/where can I see the created files? Thanks

    Rahul Padhy

    05/27/2021, 10:10 AM
    Hey everyone, I've been trying to make a connection using the Airbyte API as listed here -> https://airbyte-public-api-docs.s3.us-east-2.amazonaws.com/rapidoc-api-docs.html#post-/v1/connections/create However, if I mention the field 'operationIds' in the input schema, it errors out saying that 'operationIds' is not a valid field name, so I proceeded without it, and when I posted the data I got status code 200. When I went back to the Airbyte UI, the sync operation had started by itself and was failing, citing: Unexpected character ''' (code 39): was expecting double-quote to start field name. If there were an error, how come I got response code 200? I'm attaching the logs below. Can anyone suggest what I'm missing here?
    logs-160-2.txt
    ✅ 1
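    One common cause of that exact parser complaint (a guess, since the payload itself isn't shown): somewhere a Python dict's `str()` representation, which uses single quotes, ended up where strict JSON was expected. Character code 39 is `'`. A minimal reproduction:

```python
# Why a strict JSON parser complains about character code 39 (a single quote):
# str(dict) is not JSON; json.dumps(dict) is.
import json

payload = {"name": "my-connection", "status": "active"}

print(str(payload))         # {'name': 'my-connection', ...}  <- single quotes
print(json.dumps(payload))  # {"name": "my-connection", ...}  <- valid JSON

try:
    json.loads(str(payload))        # what a strict parser sees
except json.JSONDecodeError as e:
    print(e)                        # rejects the unexpected single quote

json.loads(json.dumps(payload))     # round-trips cleanly
```

    The 200 on create followed by the failing sync would be consistent with an invalid string getting stored in the connection config and only being parsed later, but that part is inferred from the log message alone.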

    Daniel Mateus Pires (Earnest Research)

    05/27/2021, 10:24 AM
    hey folks, I'm using Airbyte for the first time, trying to ingest from a REST API using the HTTP source. However, the API returns a JSON array, and I'm getting
    2021-05-27 10:22:09 ERROR (/tmp/workspace/2/0) LineGobbler(voidCall):69 - data
    2021-05-27 10:22:09 ERROR (/tmp/workspace/2/0) LineGobbler(voidCall):69 - value is not a valid dict (type=type_error.dict)
    2021-05-27 10:22:10 INFO (/tmp/workspace/2/0) DefaultReplicationWorker(run):134 - Source thread complete.
    2021-05-27 10:22:10 INFO (/tmp/workspace/2/0) DefaultReplicationWorker(run):135 - Waiting for destination thread to join.
    2021-05-27 10:22:11 INFO (/tmp/workspace/2/0) DefaultReplicationWorker(run):137 - Destination thread complete.
    2021-05-27 10:22:11 ERROR (/tmp/workspace/2/0) DefaultReplicationWorker(run):141 - Sync worker failed.
    in the logs, on version 0.24.1-alpha
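    The "value is not a valid dict" line suggests each emitted record has to be a JSON object, while a bare array element (e.g. a scalar) is not. A minimal sketch of pre-wrapping such a response before it reaches the connector (the `value` key is made up for illustration; this is a workaround at a proxy/pre-processing layer, not a documented source option):

```python
# Sketch: a bare JSON array of scalars fails "is it a dict?" validation;
# wrapping each element in an object sidesteps that.
import json

api_response = json.loads('[1, 2, 3]')            # bare JSON array
records = [{"value": item} for item in api_response]
print(records)  # [{'value': 1}, {'value': 2}, {'value': 3}]
```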

    Steve

    05/27/2021, 5:08 PM
    Hi, I wonder if anyone has any ideas about this, please? I have Docker installed and I've followed the instructions at https://docs.airbyte.io/quickstart/deploy-airbyte, but I'm not getting very far. 😉 macOS 11.0.1. Video and error output are attached here. Thanks!
    👀 1