# announcements
  • c

    Chaoyu

    06/27/2019, 1:45 AM
    hi everyone @here , we launched a contest on 99designs to find a logo for the BentoML project. There are over 70 submissions so far; thought you might be interested to see them. Let us know if there’s one in the list that you really like! https://99designs.com/contests/925243/entries
  • b

    Bo

    06/29/2019, 4:12 AM
    https://99designs.com/logo-design/contests/open-source-project-logo-925243
  • b

    Bo

    06/29/2019, 4:12 AM
    Come check out our logo finalists
  • b

    Bo

    06/29/2019, 4:12 AM
    Vote which one you like!
  • b

    Bo

    06/29/2019, 4:59 AM
    Sorry. This is the poll address.
  • b

    Bo

    06/29/2019, 4:59 AM
    https://99designs.com/contests/poll/f82eedae5b
  • d

    Dheeraj Gupta

    11/15/2019, 5:45 AM
    Hello
  • b

    Bo

    11/15/2019, 5:45 AM
    Hi 👋
  • d

    Dheeraj Gupta

    11/15/2019, 5:45 AM
    I have deployed tabular model using bentoml
  • d

    Dheeraj Gupta

    11/15/2019, 5:46 AM
    When I send the curl request I am getting an Internal Server Error
  • d

    Dheeraj Gupta

    11/15/2019, 5:46 AM
    ValueError: If using all scalar values, you must pass an index
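That error is pandas complaining about how the request payload was turned into a DataFrame. A minimal reproduction outside BentoML (the field names are just a subset of the payload posted later in this thread):

```python
import pandas as pd

# One prediction's worth of scalar fields, as a plain dict
record = {"Latitude": 29.8, "Longitude": -80.2, "Maximum.Wind": 25}

# A dict of scalars gives pandas no row index, so construction fails:
try:
    pd.DataFrame(record)
except ValueError as err:
    print(err)  # If using all scalar values, you must pass an index

# Wrapping the record in a list yields the intended one-row frame:
df = pd.DataFrame([record])
print(df.shape)  # (1, 3)
```

So if the handler or the predict function ever builds a DataFrame from a bare dict of scalars, this exact ValueError is the result.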
  • d

    Dheeraj Gupta

    11/15/2019, 5:47 AM
    My notebook is on an Azure machine and I am running it locally there
  • b

    Bo

    11/15/2019, 5:51 AM
    Ok. You trained the model in a notebook and are running it locally on the Azure machine. Is that correct? Could you tell me what command you used for deploying? And after you packaged the bento bundle, did you import it and run it successfully against test data?
  • d

    Dheeraj Gupta

    11/15/2019, 5:53 AM
    I followed the tabular example
  • d

    Dheeraj Gupta

    11/15/2019, 5:53 AM
    of fastai
  • c

    Chaoyu

    11/15/2019, 5:56 AM
    It may be because the HTTP request input is not in the right format. Could you share the curl command?
  • c

    Chaoyu

    11/15/2019, 5:56 AM
    And you are using DataframeHandler, right?
  • d

    Dheeraj Gupta

    11/15/2019, 6:01 AM
    curl -X POST http://127.0.0.1:5000/predict \
      -H 'Content-Type: application/json' \
      -d '[{
        "ID": "AL011980",
        "Name": "UNNAMED",
        "Status": "TD",
        "Latitude": 29.8,
        "Longitude": -80.2,
        "Maximum.Wind": 25,
        "date_time": "1980-07-18 000000",
        "diff": 5,
        "rapid_int": 0,
        "i": 5,
        "n": 17,
        "persistence": 0,
        "product": 0,
        "Initial.Max": 25,
        "speed": 17.0712,
        "speed_z": 0.0351582,
        "speed_m": 9.26624,
        "Jday": 54,
        "Maximum.Wind_p": 25,
        "Latitude_p": 29.9,
        "Longitude_p": -79.3
      }]'
  • d

    Dheeraj Gupta

    11/15/2019, 6:01 AM
    I am using this curl command
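For what it's worth, a list-of-records body like that one parses cleanly into a one-row DataFrame, which is the shape DataframeHandler would need. A quick check with a trimmed subset of the fields (the subset is just for brevity):

```python
import json

import pandas as pd

# A trimmed version of the posted body: a JSON array with one record
payload = '[{"ID": "AL011980", "Latitude": 29.8, "Longitude": -80.2, "Maximum.Wind": 25}]'

records = json.loads(payload)
frame = pd.DataFrame(records)
print(frame.shape)  # (1, 4)
```

Since the payload itself is well-formed, the scalar-values error most likely comes from code further inside the service rebuilding a DataFrame from a single dict.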
  • d

    Dheeraj Gupta

    11/15/2019, 6:01 AM
    I am running it on a different data set
  • d

    Dheeraj Gupta

    11/15/2019, 6:04 AM
    I haven't changed anything in the DataframeHandler part
  • d

    Dheeraj Gupta

    11/15/2019, 6:06 AM
    I am able to successfully install the tabular model
  • d

    Dheeraj Gupta

    11/15/2019, 6:06 AM
    I can hit localhost in the browser and see it's running on port 5000
  • c

    Chaoyu

    11/15/2019, 6:58 AM
    @Dheeraj Gupta could you also share your “predict” function code?
  • s

    Slackbot

    11/15/2019, 7:00 AM
    This message was deleted.
  • t

    tony hung

    11/20/2019, 6:10 PM
    hi everyone, I just found out about BentoML. Is it possible to perform training as well as prediction?
  • c

    Chaoyu

    11/20/2019, 6:11 PM
    hi @tony hung, do you mean perform training as the API server receives a prediction request? Or are you looking for something to help structure training code that integrates with BentoML?
  • t

    tony hung

    11/20/2019, 6:15 PM
    The ideal scenario is a way to train the model at any given time. I'm currently working on a fastai model and I need a way to retrain it at any point in time. Say there's a new training example in an S3 bucket; I'd like to retrain with that new example.
  • t

    tony hung

    11/20/2019, 6:17 PM
    I tried using SageMaker but I had errors when trying to install dependencies (librosa)
  • c

    Chaoyu

    11/20/2019, 6:24 PM
    Got it. Sounds like you would like to be able to reload a new model while the API server is running in production?
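Reloading on change is a generic pattern that works in any long-running Python server process, independent of BentoML: poll the serialized model artifact and re-deserialize it when its content changes. The `Scaler` stand-in, the pickle file, and the digest check below are all illustrative assumptions; against S3 you would compare the object's `ETag` or `LastModified` metadata instead of hashing a local file.

```python
import hashlib
import os
import pickle
import tempfile

class ReloadingModel:
    """Serve a pickled model, reloading it whenever the file changes."""

    def __init__(self, path):
        self.path = path
        self.digest = None
        self.model = None

    def _maybe_reload(self):
        # Hashing the whole file on every call is simple but not cheap;
        # a real server would poll on a timer or watch object metadata.
        with open(self.path, "rb") as f:
            data = f.read()
        digest = hashlib.md5(data).hexdigest()
        if digest != self.digest:
            self.model = pickle.loads(data)
            self.digest = digest

    def predict(self, x):
        self._maybe_reload()
        return self.model.predict(x)

class Scaler:
    """Tiny stand-in model: multiplies its input by a coefficient."""

    def __init__(self, k):
        self.k = k

    def predict(self, x):
        return self.k * x

path = os.path.join(tempfile.mkdtemp(), "model.pkl")
with open(path, "wb") as f:
    pickle.dump(Scaler(2), f)

serving = ReloadingModel(path)
print(serving.predict(10))  # 20

# "Retrain" and overwrite the artifact; the next request picks it up.
with open(path, "wb") as f:
    pickle.dump(Scaler(3), f)
print(serving.predict(10))  # 30
```

The same wrapper could sit inside whatever predict function the server exposes, so retraining only needs to overwrite the artifact.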