Sean
12/07/2022, 8:37 PM
v1.0.11 is here, featuring the introduction of an inference data collection and model monitoring API that can be easily integrated with any model monitoring framework.
• Introduced the bentoml.monitor API for monitoring features, predictions, and target data in numerical, categorical, and numerical sequence types.
import numpy as np

import bentoml
from bentoml.io import NumpyNdarray
from bentoml.io import Text

CLASS_NAMES = ["setosa", "versicolor", "virginica"]

iris_clf_runner = bentoml.sklearn.get("iris_clf:latest").to_runner()
svc = bentoml.Service("iris_classifier", runners=[iris_clf_runner])

@svc.api(
    input=NumpyNdarray.from_sample(np.array([4.9, 3.0, 1.4, 0.2], dtype=np.double)),
    output=Text(),
)
async def classify(features: np.ndarray) -> str:
    with bentoml.monitor("iris_classifier_prediction") as mon:
        mon.log(features[0], name="sepal length", role="feature", data_type="numerical")
        mon.log(features[1], name="sepal width", role="feature", data_type="numerical")
        mon.log(features[2], name="petal length", role="feature", data_type="numerical")
        mon.log(features[3], name="petal width", role="feature", data_type="numerical")

        results = await iris_clf_runner.predict.async_run([features])
        result = results[0]
        category = CLASS_NAMES[result]

        mon.log(category, name="pred", role="prediction", data_type="categorical")
    return category
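A quick usage sketch (assuming the service above is saved as service.py and served on the default HTTP port 3000): serve it with the BentoML CLI and send a feature vector to the classify endpoint; the logged features and prediction are then picked up by whichever monitoring exporter is configured.
bentoml serve service:svc
# in another terminal
curl -X POST -H "Content-Type: application/json" -d "[5.1, 3.5, 1.4, 0.2]" http://localhost:3000/classify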
• Enabled monitoring data collection either by forwarding log files with any log forwarder (Fluent Bit, Filebeat, Logstash) or by exporting through an OTLP exporter implementation; forwarder and collector sketches follow after this list.
◦ Configuration for monitoring data collection through log files.
monitoring:
  enabled: true
  type: default
  options:
    log_path: path/to/log/file
◦ Configuration for monitoring data collection through an OTLP exporter.
monitoring:
  enabled: true
  type: otlp
  options:
    endpoint: http://localhost:5000
    insecure: true
    credentials: null
    headers: null
    timeout: 10
    compression: null
    meta_sample_rate: 1.0
• Supported third-party monitoring data collector integrations through BentoML Plugins. See bentoml/plugins repository for more details.
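For the log-file route, a downstream forwarder simply tails the files written under log_path. A minimal sketch using Fluent Bit's tail input (the file pattern and stdout output are illustrative assumptions; Filebeat or Logstash would be configured analogously):
[INPUT]
    Name  tail
    Path  path/to/log/file/*.log
[OUTPUT]
    Name   stdout
    Match  *
For the OTLP route, the endpoint above just needs to point at any OTLP-capable backend. A minimal OpenTelemetry Collector pipeline that receives the records is sketched below, assuming they arrive as OTLP log records over HTTP on port 5000:
receivers:
  otlp:
    protocols:
      http:
        endpoint: 0.0.0.0:5000
exporters:
  logging:
service:
  pipelines:
    logs:
      receivers: [otlp]
      exporters: [logging]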
🐳 Improved containerization SDK and CLI options; read more in #3164.
• Added support for multiple backend builder options (Docker, nerdctl, Podman, Buildah, Buildx) in addition to buildctl (the standalone BuildKit builder).
• Improved Python SDK for containerization with different backend builder options.
import bentoml
bentoml.container.build("iris_classifier:latest", backend="podman", features=["grpc","grpc-reflection"], **kwargs)
• Improved the CLI to include the newly added backend options; a sketch of the invocation follows below.
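The exact flag names are assumptions here (check bentoml containerize --help for what your version exposes), but the CLI equivalent of the SDK call above would look roughly like:
# option names such as --backend and --enable-features are assumptions; verify with `bentoml containerize --help`
bentoml containerize iris_classifier:latest --backend podman --enable-features grpc,grpc-reflection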
• Standardized the generated Dockerfile in bentos to be compatible with all build tools for use cases that require building from a Dockerfile directly.
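Because the Dockerfile shipped inside a bento is now standardized, it can be fed to a plain docker build directly. A hedged sketch, assuming the usual bento layout (the env/docker/Dockerfile location and the .path attribute are assumptions, not documented guarantees):
# locate the bento directory on disk
cd "$(python -c 'import bentoml; print(bentoml.get("iris_classifier:latest").path)')"
# build with any Dockerfile-compatible tool using the standardized Dockerfile in the bento
docker build -f env/docker/Dockerfile -t iris_classifier:latest .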
💡 We continue to update the documentation and examples on every release to help the community unlock the full power of BentoML.
• Learn more about inference data collection and model monitoring capabilities in BentoML.
• Learn more about the default metrics that come out of the box and how to add custom metrics in BentoML.