# announcements
s
This message was deleted.
💯 1
💡 1
🙏 1
👍 1
💪 1
🔥 3
❤️ 2
🎉 14
🍱 12
🚀 2
s
Can you share the use cases for this exciting new feature?
c
@Shihgian Lee this new feature is more of a quality-of-life change for our users. A couple of things I really like about it:
1. Notebook usage! You can now launch a Bento server from a notebook without blocking the notebook's execution. It returns a handle that you can use for interacting with the server and displaying the results. This works on Jupyter and Google Colab! (See the sketch just below.)
2. It makes it easy to build customized inference scripts. You can now write a simple script that reads in input features, runs them through a long-running inference job, and saves the results, all using this API. You may have noticed that our implementation for Spark batch inference is based on this API as well.
3. It makes it easy to customize the server init script. For example, if you want a custom init script that launches background jobs, runs Streamlit side by side, or starts a log daemon, you can now write a Python script to initialize all of that and then call this API to start the BentoML server.

This API exposes BentoML's core serving functionality as a more flexible Python API; I can see our community building other new features on top of it as well!
❤️ 3
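For illustration, a minimal sketch of the notebook workflow described above. The constructor signature matches the one shown later in this thread, but the `start`/`get_client`/`stop` method names and the quickstart `iris_classifier` bento with a `classify` endpoint are assumptions that may differ across BentoML versions.

```python
# A sketch, not confirmed by the thread: assumes `bentoml.HTTPServer` with
# start()/get_client()/stop() helpers and the quickstart iris_classifier
# bento exposing a `classify` endpoint.
import numpy as np
from bentoml import HTTPServer

server = HTTPServer("iris_classifier:latest", production=True, port=3000)
server.start(blocking=False)  # returns immediately; the notebook keeps running

client = server.get_client()  # handle for interacting with the running server
print(client.classify(np.array([[5.9, 3.0, 5.1, 1.8]])))

server.stop()  # shut the server down when finished
```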
c
Do you have an example of how to use this in production? Could we create the server and then run it with uvicorn, like `uvicorn main:server`? Is there anything else we need to do to ensure it is suitable for production use?
c
The behavior of the API is actually identical to the CLI command `bentoml serve --production`. Performance-wise, it applies all the best practices that BentoML does and is suitable for production.
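To make that equivalence concrete, a hedged sketch (the CLI flags come from the thread; the `blocking` parameter is an assumption introduced here to mirror the foreground behavior of the CLI):

```python
# Per the thread, these are claimed to behave identically:
#
#   CLI:    bentoml serve iris_classifier:latest --production
#
# Python sketch (blocking=True is an assumed parameter, used here to keep
# the process in the foreground like the CLI command does):
from bentoml import HTTPServer

HTTPServer("iris_classifier:latest", production=True, port=3000).start(blocking=True)
```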
j
This should be really nice for integration testing, too!
👍 2
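A hedged sketch of that integration-testing idea, built as a pytest fixture around the same assumed `start`/`get_client`/`stop` methods and the hypothetical `classify` endpoint:

```python
# A sketch of integration testing against a live Bento server; the
# HTTPServer method names and the classify endpoint are assumptions.
import numpy as np
import pytest
from bentoml import HTTPServer

@pytest.fixture(scope="session")
def server():
    srv = HTTPServer("iris_classifier:latest", production=True, port=3000)
    srv.start(blocking=False)  # start the server once for the test session
    yield srv
    srv.stop()                 # tear it down after all tests finish

def test_classify(server):
    client = server.get_client()
    result = client.classify(np.array([[5.9, 3.0, 5.1, 1.8]]))
    assert result is not None
```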
c
@Jim Rohrer yes exactly!
c
So `bentoml serve --production` isn't using uvicorn? Or is `server = HTTPServer("iris_classifier:latest", production=True, port=3000)` somehow using uvicorn under the hood?
I did some digging - I can see the http_api_server is actually a uvicorn server internally
👍 1
c
both are using uvicorn under the hood
👍 1