Sean
09/27/2022, 11:14 PM
v1.0.6 featuring the gRPC preview! Without changing a line of code, you can now serve your Bento as a gRPC service. Similar to serving over HTTP, BentoML gRPC supports all the ML frameworks, observability features, adaptive batching, and more out of the box, simply by calling the serve-grpc CLI command.
> pip install "bentoml[grpc]"
> bentoml serve-grpc iris_classifier:latest --production
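For reference, a Bento built from a standard service definition like the sketch below (the model tag and API name are illustrative) can be served over HTTP with bentoml serve or over gRPC with bentoml serve-grpc, unchanged:

# service.py -- illustrative service definition; assumes a model saved
# under the tag "iris_clf:latest" in the local model store
import numpy as np

import bentoml
from bentoml.io import NumpyNdarray

iris_clf_runner = bentoml.sklearn.get("iris_clf:latest").to_runner()

svc = bentoml.Service("iris_classifier", runners=[iris_clf_runner])

@svc.api(input=NumpyNdarray(), output=NumpyNdarray())
def classify(input_series: np.ndarray) -> np.ndarray:
    # the same endpoint is exposed over both HTTP and gRPC
    return iris_clf_runner.predict.run(input_series)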
• Check out our updated tutorial for a quick 10-minute crash course on BentoML gRPC.
• Review the standardized Protobuf definition of service APIs and IO types (NDArray, DataFrame, File/Image, JSON, etc.).
• Learn more about multi-language client support (Python, Go, Java, Node.js, etc.) with working examples; a Python client sketch follows this list.
• Customize the gRPC service by mounting new servicers and interceptors; see the interceptor sketch after the preview note below.
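A minimal Python client sketch, assuming the v1alpha1 stubs shipped with bentoml[grpc] and the default localhost:3000 address; the service, method, and message names used here should be checked against the Protobuf definition linked above:

import asyncio

import grpc
from bentoml.grpc.v1alpha1 import service_pb2 as pb
from bentoml.grpc.v1alpha1 import service_pb2_grpc as services

async def run() -> None:
    # connect to the Bento served by `bentoml serve-grpc`
    async with grpc.aio.insecure_channel("localhost:3000") as channel:
        stub = services.BentoServiceStub(channel)
        response = await stub.Call(
            pb.Request(
                api_name="classify",  # name of the @svc.api endpoint
                ndarray=pb.NDArray(
                    dtype=pb.NDArray.DTYPE_FLOAT,
                    shape=(1, 4),
                    float_values=[5.9, 3.0, 5.1, 1.8],
                ),
            )
        )
        print(response)

if __name__ == "__main__":
    asyncio.run(run())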
⚠️ gRPC is currently in preview. The public APIs may undergo incompatible changes in future patch releases until the official v1.1.0 minor version release.
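A sketch of the interceptor customization mentioned above, assuming the add_grpc_interceptor registration hook described in the customization guide; the hook name and the logging behavior here are illustrative, not the definitive API:

import logging

from grpc import aio

import bentoml

class AccessLogInterceptor(aio.ServerInterceptor):
    async def intercept_service(self, continuation, handler_call_details):
        # log the RPC method name before handing the call to the next handler
        logging.info("gRPC call: %s", handler_call_details.method)
        return await continuation(handler_call_details)

svc = bentoml.Service("iris_classifier")

# assumed registration hook; see the customization docs for the exact API
svc.add_grpc_interceptor(AccessLogInterceptor)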
• Enhanced access logging format to output Trace and Span IDs in the more standard hex encoding by default.
• Added request total, duration, and in-progress metrics to Runners, in addition to API Servers.
• Added support for XGBoost models that use the scikit-learn API; see the sketch after this list.
• Added support for restricting image mime types in the Image IO descriptor.
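A minimal sketch of the XGBoost scikit-learn API support, assuming bentoml.xgboost.save_model accepts the sklearn wrapper classes as of this release (the model tag is illustrative):

import bentoml
from sklearn.datasets import load_iris
from xgboost import XGBClassifier

# train a small XGBoost model through the scikit-learn API
X, y = load_iris(return_X_y=True)
clf = XGBClassifier(n_estimators=50).fit(X, y)

# save it to the local BentoML model store under an illustrative tag
tag = bentoml.xgboost.save_model("iris_xgb_sklearn", clf)
print(f"saved model: {tag}")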
🥂 We’d like to thank our community for their contributions and support.
• Shout out to @benjamintanweihao for fixing a BentoML CLI bug.
• Shout out to @lsh918 for fixing a PyTorch framework issue.
• Shout out to @jeffthebear for enhancing the Pandas DataFrame OpenAPI schema.
• Shout out to @jiewpeng for adding the support for customizing access logs with Trace and Span ID formats.
📣 We’d also like to invite everyone to join a Slack AMA session with the Co-Founder and Chief Product Officer of Arize, @Aparna Dhinakaran, on Thursday, September 29, 1-2pm PST/4-5pm EST. Looking forward to seeing everyone here!