# announcements
  • d

    Deepak Dhiman

    03/11/2022, 12:33 PM
    Thanks
  • s

    Suhas

    03/11/2022, 12:33 PM
    Try with this version, it should work. Version: 1.0.0a6 @Deepak Dhiman
  • d

    Deepak Dhiman

    03/11/2022, 12:36 PM
@Suhas I tried with this version. I get the error below:
  • d

    Deepak Dhiman

    03/11/2022, 12:36 PM
Am I doing something wrong?
  • s

    Suhas

    03/11/2022, 12:39 PM
    https://pypi.org/project/bentoml/#history Check here for versions, your problem might be because of your OS version. @Deepak Dhiman
    ✅ 1
  • d

    Deepak Dhiman

    03/11/2022, 12:41 PM
    It worked @Suhas. Thank you 🙂
  • s

    Suhas

    03/31/2022, 3:32 PM
Hello, can anyone confirm that pytorch=1.9.0 and transformers=4.17.0 are supported by BentoML?
  • c

    Chaoyu

    04/05/2022, 9:00 PM
    set the channel topic: This channel will also be used for general announcements from BentoML team. Follow us on twitter! https://twitter.com/bentomlai
  • c

    Chaoyu

    04/06/2022, 12:24 AM
    set the channel topic: This channel is used for general announcements from BentoML team.
  • c

    Chaoyu

    04/06/2022, 12:24 AM
    set the channel description: Follow us on twitter! https://twitter.com/bentomlai
  • c

    Chaoyu

    04/06/2022, 1:32 AM
    set the channel topic: This channel is used for general announcements from BentoML team. For support and general questions, please use the #support channel.
  • c

    Chaoyu

    04/06/2022, 7:54 PM
    Hi <!channel>! BentoML 1.0.0a7 has just been released with a number of improvements and bug fixes. The two most significant changes are:
    • BREAKING CHANGE: The default serving port has been changed to 3000
    ◦ This is due to an issue with newer macOS versions where port 5000 is always in use.
    ◦ This will affect the default serving port when deploying with Docker. Existing 1.0 preview release users will need to either change their deployment config to use port 3000, or pass --port 5000 to the container command, in order to keep the previous default port setting.
    • The new import/export API is available now!
    ◦ Users can now export models and bentos from the local store to a standalone file
    ◦ Learn more via bentoml export --help and bentoml models export --help
    We’ve also recently released Yatai version 0.2.1, with major refactoring around the deployment controller. Users can now create Bento deployments directly via kubectl and a Kubernetes resource YAML file, in addition to the Yatai Web UI and REST API:
    # my_deployment.yaml
    apiVersion: serving.yatai.ai/v1alpha1
    kind: BentoDeployment
    metadata:
      name: demo
    spec:
      bento_tag: iris_classifier:3oevmqfvnkvwvuqj
      resources:
        limits:
          cpu: 1000m
        requests:
          cpu: 500m
    Apply the deployment to your cluster:
    kubectl apply -f my_deployment.yaml
    This will make it easy for DevOps teams to customize BentoML deployments on a Kubernetes cluster, with additional k8s resources such as credentials, databases, policies and other services. On the bentoctl project, we are working on a major new version that embraces a terraform-based workflow, to simplify deploying Bentos to any cloud platform, such as AWS EC2, Lambda, SageMaker, Azure, GCP, Heroku, etc. If you are interested in learning more or helping with beta testing, definitely chat with @Bo and @jjmachan.
    🙌 20
    💯 7
    👍 8
  • t

    Tim Liu

    04/11/2022, 4:35 PM
    Hello BentoML community! We're happy to announce that we've released our first blog post! This first piece talks about the broad set of benefits offered by BentoML. In the future we'll be deep diving into lots of different topics. Would love your support, thanks! Twitter Post LinkedIn Post
    🙌 6
    🎉 7
  • t

    Tim Liu

    04/25/2022, 3:44 PM
    Our intern @Spencer Churchill just wrote up a nice blog post on model reproducibility and the concerns that BentoML helps address out of the box: https://l.bentoml.com/4196 Cool story too about building a covid cough prediction model!
    👍 5
    👏 2
    ❤️ 6
  • m

    minjune kim

    04/26/2022, 11:07 PM
    Question: Is there a way to call an instance of BentoService directly as a Python object, instead of making an HTTP call? I have the source for bentoml packing and I am trying to write a simulation test, but doing it over HTTP is too slow.
  • m

    minjune kim

    04/26/2022, 11:07 PM
    BentoService.inference_apis[0].infer seems interesting but I can't seem to make it work. On 0.13 btw.
  • m

    minjune kim

    04/26/2022, 11:10 PM
    Oh figured it out, thanks all! Just needed to call the api function directly.
    🎉 4
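    The trick above — calling the decorated API function directly instead of going through the HTTP server — avoids the per-request serialization and networking overhead that made the simulation test slow. A minimal stdlib-only sketch of the idea (the predict function here is a hypothetical stand-in, not the BentoML 0.13 API; in 0.13 it would be the method decorated with @api on your BentoService class):

```python
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import Request, urlopen

# Hypothetical stand-in for a model's inference function.
def predict(features):
    return {"sum": sum(features)}

# Minimal HTTP wrapper, mimicking how a model server exposes predict().
class Handler(BaseHTTPRequestHandler):
    def do_POST(self):
        body = self.rfile.read(int(self.headers["Content-Length"]))
        payload = json.dumps(predict(json.loads(body))).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(payload)))
        self.end_headers()
        self.wfile.write(payload)

    def log_message(self, *args):  # silence per-request logging
        pass

server = HTTPServer(("127.0.0.1", 0), Handler)
threading.Thread(target=server.serve_forever, daemon=True).start()

# Slow path: a full HTTP round trip per inference call.
req = Request(
    f"http://127.0.0.1:{server.server_port}/",
    data=json.dumps([1, 2, 3]).encode(),
    headers={"Content-Type": "application/json"},
)
via_http = json.loads(urlopen(req).read())

# Fast path: bypass HTTP and call the function directly,
# as in the simulation-test workaround above.
direct = predict([1, 2, 3])

assert direct == via_http  # same result, no HTTP overhead
server.shutdown()
```

    In a tight simulation loop, the direct call skips JSON encoding, socket I/O, and request dispatch entirely, which is why it is so much faster than hitting the served endpoint.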
  • d

    Deepak Sharma

    05/05/2022, 10:45 AM
    hey guys, I'm trying to deploy an already trained & exported model onto a Bento server for model serving