GitHub
04/17/2024, 2:41 AM
<https://github.com/bentoml/BentoML/tree/main|main>
by Sherlock113
<https://github.com/bentoml/BentoML/commit/2dbd57ae3ddb60a0a73694a6661f2a417d0e43ed|2dbd57ae>
- docs: Update example project list (#4673)
bentoml/BentoML
GitHub
04/17/2024, 3:08 AM
<https://github.com/bentoml/BentoML/tree/main|main>
by Sherlock113
<https://github.com/bentoml/BentoML/commit/0dc552bcdb9f190f56666be3a12ebf23ab693d03|0dc552bc>
- docs: Add the monitoring and data collection doc (#4662)
bentoml/BentoML
GitHub
04/17/2024, 5:01 AM
<https://github.com/bentoml/BentoML/tree/main|main>
by Sherlock113
<https://github.com/bentoml/BentoML/commit/e10f0ef4250b77dffe1c9db08011fc1b273e4708|e10f0ef4>
- docs: Add add_asgi_middleware doc (#4672)
bentoml/BentoML
GitHub
04/17/2024, 8:19 AM
Has the `pre-commit run -a` script passed (instructions)?
☐ Did you read through the contribution guidelines and follow the development guidelines?
☐ Did your changes require updates to the documentation? Have you updated those accordingly? Here are documentation guidelines and tips on writing docs.
☐ Did you write tests to cover your changes?
bentoml/BentoML
GitHub Actions: evergreen
GitHub Actions: report-coverage
GitHub Actions: bento_server_http-e2e-tests (python3.8.macos-latest)
GitHub Actions: unit-tests (python3.11.windows-latest)
✅ 26 other checks have passed
26/30 successful checks
GitHub
04/17/2024, 8:46 AM
<https://github.com/bentoml/BentoML/tree/main|main>
by bojiang
<https://github.com/bentoml/BentoML/commit/e50050a947f11b981753c51e3f17bd895b93bb13|e50050a9>
- fix: delete useless enum and fix enum value (#4674)
bentoml/BentoML
GitHub
04/18/2024, 5:25 AM
Has the `pre-commit run -a` script passed (instructions)?
☐ Did you read through the contribution guidelines and follow the development guidelines?
☑︎ Did your changes require updates to the documentation? Have you updated those accordingly? Here are documentation guidelines and tips on writing docs.
☐ Did you write tests to cover your changes?
bentoml/BentoML
GitHub Actions: evergreen
GitHub Actions: report-coverage
GitHub Actions: bento_server_http-e2e-tests (python3.8.macos-latest)
✅ 27 other checks have passed
27/30 successful checks
GitHub
04/18/2024, 5:38 AM
<https://github.com/bentoml/BentoML/tree/main|main>
by Sherlock113
<https://github.com/bentoml/BentoML/commit/b39fbcce2edbff811217a97190c6146162430733|b39fbcce>
- docs: Add RAG tutorial (#4675)
bentoml/BentoML
GitHub
04/18/2024, 7:18 AM
Has the `pre-commit run -a` script passed (instructions)?
☐ Did you read through the contribution guidelines and follow the development guidelines?
☑︎ Did your changes require updates to the documentation? Have you updated those accordingly? Here are documentation guidelines and tips on writing docs.
☐ Did you write tests to cover your changes?
bentoml/BentoML
✅ All checks have passed
3/3 successful checks
GitHub
04/18/2024, 7:22 AM
<https://github.com/bentoml/BentoML/tree/main|main>
by Sherlock113
<https://github.com/bentoml/BentoML/commit/f7a4b3a7e63436c93a596988f4de879960a9065b|f7a4b3a7>
- docs: Update the clients doc (#4676)
bentoml/BentoML
GitHub
04/18/2024, 7:22 AM
<https://github.com/bentoml/BentoML/tree/main|main>
by Sherlock113
<https://github.com/bentoml/BentoML/commit/dc33ee9561ece96e0a19d9dcd7f7a3ea85bd60f5|dc33ee95>
- docs: Add some explanations for bentoml.models.get (#4660)
bentoml/BentoML
GitHub
04/18/2024, 8:16 AM
Has the `pre-commit run -a` script passed (instructions)?
☐ Did you read through the contribution guidelines and follow the development guidelines?
☐ Did your changes require updates to the documentation? Have you updated those accordingly? Here are documentation guidelines and tips on writing docs.
☐ Did you write tests to cover your changes?
bentoml/BentoML
GitHub Actions: evergreen
GitHub Actions: report-coverage
GitHub Actions: bento_server_http-e2e-tests (python3.8.macos-latest)
✅ 27 other checks have passed
27/30 successful checks
GitHub
04/18/2024, 9:31 AM
bentoml serve
After building my bento I get the following error:
Traceback (most recent call last):
File "/home/tom/Desktop/ml-reconciliation/venv/bin/bentoml", line 8, in <module>
sys.exit(cli())
File "/home/tom/Desktop/ml-reconciliation/venv/lib/python3.10/site-packages/click/core.py", line 1157, in __call__
return self.main(*args, **kwargs)
File "/home/tom/Desktop/ml-reconciliation/venv/lib/python3.10/site-packages/click/core.py", line 1078, in main
rv = self.invoke(ctx)
File "/home/tom/Desktop/ml-reconciliation/venv/lib/python3.10/site-packages/click/core.py", line 1688, in invoke
return _process_result(sub_ctx.command.invoke(sub_ctx))
File "/home/tom/Desktop/ml-reconciliation/venv/lib/python3.10/site-packages/click/core.py", line 1434, in invoke
return ctx.invoke(self.callback, **ctx.params)
File "/home/tom/Desktop/ml-reconciliation/venv/lib/python3.10/site-packages/click/core.py", line 783, in invoke
return __callback(*args, **kwargs)
File "/home/tom/Desktop/ml-reconciliation/venv/lib/python3.10/site-packages/bentoml_cli/utils.py", line 362, in wrapper
return func(*args, **kwargs)
File "/home/tom/Desktop/ml-reconciliation/venv/lib/python3.10/site-packages/bentoml_cli/utils.py", line 333, in wrapper
return_value = func(*args, **kwargs)
File "/home/tom/Desktop/ml-reconciliation/venv/lib/python3.10/site-packages/click/decorators.py", line 33, in new_func
return f(get_current_context(), *args, **kwargs)
File "/home/tom/Desktop/ml-reconciliation/venv/lib/python3.10/site-packages/bentoml_cli/utils.py", line 290, in wrapper
return func(*args, **kwargs)
File "/home/tom/Desktop/ml-reconciliation/venv/lib/python3.10/site-packages/bentoml_cli/env_manager.py", line 122, in wrapper
return func(*args, **kwargs)
File "/home/tom/Desktop/ml-reconciliation/venv/lib/python3.10/site-packages/bentoml_cli/serve.py", line 260, in serve
serve_http_production(
File "/home/tom/Desktop/ml-reconciliation/venv/lib/python3.10/site-packages/simple_di/__init__.py", line 139, in _
return func(*_inject_args(bind.args), **_inject_kwargs(bind.kwargs))
File "/home/tom/Desktop/ml-reconciliation/venv/lib/python3.10/site-packages/bentoml/serve.py", line 327, in serve_http_production
json.dumps(runner.scheduled_worker_env_map),
File "/home/tom/Desktop/ml-reconciliation/venv/lib/python3.10/site-packages/bentoml/_internal/runner/runner.py", line 356, in scheduled_worker_env_map
for worker_id in range(self.scheduled_worker_count)
File "/home/tom/Desktop/ml-reconciliation/venv/lib/python3.10/site-packages/bentoml/_internal/runner/runner.py", line 341, in scheduled_worker_count
return self.scheduling_strategy.get_worker_count(
File "/home/tom/Desktop/ml-reconciliation/venv/lib/python3.10/site-packages/bentoml/_internal/runner/strategy.py", line 68, in get_worker_count
resource_request = system_resources()
File "/home/tom/Desktop/ml-reconciliation/venv/lib/python3.10/site-packages/bentoml/_internal/resource.py", line 46, in system_resources
res[resource_kind] = resource.from_system()
File "/home/tom/Desktop/ml-reconciliation/venv/lib/python3.10/site-packages/bentoml/_internal/resource.py", line 248, in from_system
pynvml.nvmlInit()
File "/home/tom/Desktop/ml-reconciliation/venv/lib/python3.10/site-packages/pynvml/nvml.py", line 1770, in nvmlInit
nvmlInitWithFlags(0)
File "/home/tom/Desktop/ml-reconciliation/venv/lib/python3.10/site-packages/pynvml/nvml.py", line 1760, in nvmlInitWithFlags
_nvmlCheckReturn(ret)
File "/home/tom/Desktop/ml-reconciliation/venv/lib/python3.10/site-packages/pynvml/nvml.py", line 833, in _nvmlCheckReturn
raise NVMLError(ret)
pynvml.nvml.NVMLError_DriverNotLoaded: Driver Not Loaded
To reproduce
bentoml serve
Expected behavior
`bentoml serve` running without error
Environment
BentoML: 1.1.11
Python: 3.10
torch: 2.2.1
Ubuntu: 22.04
No NVIDIA GPU
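The traceback shows `system_resources()` calling `pynvml.nvmlInit()` unconditionally, which raises `NVMLError_DriverNotLoaded` on hosts without an NVIDIA driver. A minimal defensive sketch of GPU detection (an illustration of guarding NVML on driverless machines, not BentoML's actual fix):

```python
def detect_gpu_count() -> int:
    """Return the number of visible NVIDIA GPUs, or 0 when the NVML
    library or driver is unavailable (e.g. CPU-only hosts)."""
    try:
        import pynvml  # provided by the nvidia-ml-py package

        pynvml.nvmlInit()
        try:
            return pynvml.nvmlDeviceGetCount()
        finally:
            pynvml.nvmlShutdown()
    except Exception:
        # ImportError when pynvml is absent, or pynvml.NVMLError
        # (e.g. DriverNotLoaded) when no driver is installed.
        return 0
```

On the CPU-only Ubuntu host described above, a guard like this would report 0 GPUs instead of crashing the serve startup path.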
bentoml/BentoML
GitHub
04/18/2024, 9:34 AM
Has the `pre-commit run -a` script passed (instructions)?
☐ Did you read through the contribution guidelines and follow the development guidelines?
☑︎ Did your changes require updates to the documentation? Have you updated those accordingly? Here are documentation guidelines and tips on writing docs.
☐ Did you write tests to cover your changes?
bentoml/BentoML
GitHub Actions: cleanup
✅ 2 other checks have passed
2/3 successful checks
GitHub
04/18/2024, 10:39 AM
<https://github.com/bentoml/BentoML/tree/main|main>
by Sherlock113
<https://github.com/bentoml/BentoML/commit/74936ed3a044951d9953bcf75af894be04c8fdcf|74936ed3>
- docs: Add e2e test doc (#4679)
bentoml/BentoML
GitHub
04/19/2024, 6:44 AM
GitHub
04/19/2024, 7:16 AM
<https://github.com/bentoml/BentoML/tree/main|main>
by bojiang
<https://github.com/bentoml/BentoML/commit/606c975019996a26ac1430bbd0aaed7148be569a|606c9750>
- fix(cloud client): various type error (#4680)
bentoml/BentoML
GitHub
04/19/2024, 7:56 AM
GitHub
04/19/2024, 8:12 AM
<https://github.com/bentoml/BentoML/tree/main|main>
by bojiang
<https://github.com/bentoml/BentoML/commit/bd47504d84d1792d40235bd204e79895209ffbd8|bd47504d>
- fix(cli): bentoml cli verbosity not passed to the subprocess correctly (#4661)
bentoml/BentoML
GitHub
04/19/2024, 10:29 AM
GitHub
04/19/2024, 5:14 PM
Has the `pre-commit run -a` script passed (instructions)?
☑︎ Did you read through the contribution guidelines and follow the development guidelines?
☑︎ Did your changes require updates to the documentation? Have you updated those accordingly? Here are documentation guidelines and tips on writing docs.
☑︎ Did you write tests to cover your changes?
bentoml/BentoML
GitHub
04/22/2024, 2:53 PM
I'm on the `bentoml==1.2.12` version, following the docs on how to use a gRPC client:
    from __future__ import annotations

    import asyncio
    import logging

    import numpy as np

    import bentoml


    async def async_run(client: bentoml.client.Client):
        res = await client.async_classify(np.array([[5.9, 3, 5.1, 1.8]]))
        logger.info("Result from 'client.async_classify':\n%s", res)
        res = await client.async_call("classify", np.array([[5.9, 3, 5.1, 1.8]]))
        logger.info("Result from 'client.async_call':\n%s", res)


    def run(client: bentoml.client.Client):
        res = client.classify(np.array([[5.9, 3, 5.1, 1.8]]))
        logger.info("Result from 'client.classify':\n%s", res)
        res = client.call("classify", np.array([[5.9, 3, 5.1, 1.8]]))
        logger.info("Result from 'client.call(bentoml_api_name='classify')':\n%s", res)


    if __name__ == "__main__":
        import argparse

        logger = logging.getLogger(__name__)
        ch = logging.StreamHandler()
        formatter = logging.Formatter("%(message)s")
        ch.setFormatter(formatter)
        logger.addHandler(ch)
        logger.setLevel(logging.DEBUG)

        parser = argparse.ArgumentParser()
        parser.add_argument("-s", "--sync", action="store_true", default=False)
        args = parser.parse_args()

        c = bentoml.client.Client.from_url("localhost:3000", kind="grpc")
        if args.sync:
            run(c)
        else:
            asyncio.run(async_run(c))
Trying to execute the above ☝️ and getting the following exception
root@42ca13f78843:/tmp/bentomlclientexample# python client_grpc_example.py
Traceback (most recent call last):
File "/tmp/bentomlclientexample/client_grpc_example.py", line 46, in <module>
c = bentoml.client.Client.from_url("0.0.0.0:3000", kind="grpc")
File "/usr/local/lib/python3.10/site-packages/bentoml/_internal/client/__init__.py", line 109, in from_url
return SyncClient.from_url(server_url, kind=kind, **kwargs)
File "/usr/local/lib/python3.10/site-packages/bentoml/_internal/client/__init__.py", line 367, in from_url
return SyncGrpcClient.from_url(server_url, **kwargs)
File "/usr/local/lib/python3.10/site-packages/bentoml/_internal/client/grpc.py", line 728, in from_url
with GrpcClient._create_channel(
AttributeError: type object 'GrpcClient' has no attribute '_create_channel'
Looking at the code:

    class GrpcClient(Client):
        def __init__(self, svc: Service, server_url: str):
            self._sync_client = SyncGrpcClient(svc=svc, server_url=server_url)
            self._async_client = AsyncGrpcClient(svc=svc, server_url=server_url)
            super().__init__(svc, server_url)

Neither GrpcClient nor Client (defined here) has _create_channel defined.
Not crucial, but also worth mentioning that the kind="grpc" argument is needed, which isn't stated in the doc ☝️ (I had to dig into the code to get it right).
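The failure mode in the traceback is plain attribute lookup: `from_url` refers to a helper that neither class in the hierarchy defines, so Python raises `AttributeError` at call time, before any network I/O. A stripped-down reproduction (class names mirror the report; the bodies are illustrative only, not BentoML's implementation):

```python
class Client:
    @classmethod
    def from_url(cls, server_url: str):
        # Refers to a helper the class hierarchy never defines, so
        # attribute lookup on the class raises AttributeError here.
        return cls._create_channel(server_url)


class GrpcClient(Client):
    pass


try:
    GrpcClient.from_url("localhost:3000")
except AttributeError as exc:
    print(exc)  # type object 'GrpcClient' has no attribute '_create_channel'
```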
To reproduce
No response
Expected behavior
The examples provided in https://docs.bentoml.org/en/v1.1.11/guides/grpc.html should work
Environment
Environment variable
BENTOML_DEBUG=''
BENTOML_QUIET=''
BENTOML_BUNDLE_LOCAL_BUILD=''
BENTOML_DO_NOT_TRACK=''
BENTOML_CONFIG=''
BENTOML_CONFIG_OPTIONS=''
BENTOML_PORT=''
BENTOML_HOST=''
BENTOML_API_WORKERS=''
System information
`bentoml`: 1.2.12
`python`: 3.10.9
`platform`: Linux-5.10.76-linuxkit-aarch64-with-glibc2.31
`uid_gid`: 0:0
pip_packages
absl-py==1.3.0
aiohttp==3.9.5
aiosignal==1.3.1
annotated-types==0.6.0
anyio==4.3.0
appdirs==1.4.4
asgiref==3.8.1
astunparse==1.6.3
async-timeout==4.0.3
attrs==23.2.0
bentoml==1.2.12
build==1.2.1
cachetools==5.2.0
cattrs==23.1.2
certifi==2022.12.7
charset-normalizer==2.1.1
circus==0.18.0
click==8.1.7
click-option-group==0.5.6
cloudpickle==3.0.0
contextlib2==21.6.0
deepmerge==1.1.1
Deprecated==1.2.14
exceptiongroup==1.2.1
flatbuffers==1.12
frozenlist==1.4.1
fs==2.4.16
gast==0.4.0
google-auth==2.15.0
google-auth-oauthlib==0.4.6
google-pasta==0.2.0
grpcio==1.51.1
h11==0.14.0
h5py==3.7.0
httpcore==1.0.5
httpx==0.27.0
idna==3.4
importlib-metadata==6.11.0
inflection==0.5.1
Jinja2==3.1.3
keras==2.9.0
Keras-Preprocessing==1.1.2
libclang==14.0.6
Markdown==3.4.1
markdown-it-py==3.0.0
MarkupSafe==2.1.1
mdurl==0.1.2
multidict==6.0.5
numpy==1.24.1
nvidia-ml-py==11.525.150
oauthlib==3.2.2
opentelemetry-api==1.20.0
opentelemetry-instrumentation==0.41b0
opentelemetry-instrumentation-aiohttp-client==0.41b0
opentelemetry-instrumentation-asgi==0.41b0
opentelemetry-sdk==1.20.0
opentelemetry-semantic-conventions==0.41b0
opentelemetry-util-http==0.41b0
opt-einsum==3.3.0
packaging==22.0
pathspec==0.12.1
pip-requirements-parser==32.0.1
pip-tools==7.4.1
prometheus_client==0.20.0
protobuf==3.19.6
psutil==5.9.8
pyasn1==0.4.8
pyasn1-modules==0.2.8
pydantic==2.7.0
pydantic_core==2.18.1
Pygments==2.17.2
pyparsing==3.1.2
pyproject_hooks==1.0.0
python-dateutil==2.9.0.post0
python-json-logger==2.0.7
python-multipart==0.0.9
PyYAML==6.0.1
pyzmq==26.0.2
requests==2.28.1
requests-oauthlib==1.3.1
rich==13.7.1
rsa==4.9
schema==0.7.5
simple-di==0.1.5
six==1.16.0
sniffio==1.3.1
starlette==0.37.2
tensorboard==2.9.1
tensorboard-data-server==0.6.1
tensorboard-plugin-wit==1.8.1
tensorflow==2.9.2
tensorflow-estimator==2.9.0
tensorflow-io-gcs-filesystem==0.29.0
termcolor==2.1.1
tomli==2.0.1
tomli_w==1.0.0
tornado==6.4
typing_extensions==4.11.0
urllib3==1.26.13
uvicorn==0.29.0
watchfiles==0.21.0
Werkzeug==2.2.2
wrapt==1.14.1
yarl==1.9.4
zipp==3.18.1
bentoml/BentoML
GitHub
04/22/2024, 3:33 PM
File "C:\Users\Path\lib\site-packages\uvicorn\server.py", line 140, in startup
sock = socket.fromfd(config.fd, socket.AF_UNIX, socket.SOCK_STREAM)
AttributeError: module 'socket' has no attribute 'AF_UNIX'
I tried changing the socket attribute to AF_INET; the error message disappears, but the client cannot connect to the BentoML server.
Thanks,
To reproduce
No response
Expected behavior
No response
Environment
bentoml:1.2.12
python:3.9.18
uvicorn:0.29.0
Windows: 11 Pro 22H2
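`socket.AF_UNIX` is only defined on POSIX builds of CPython, which is why the attribute lookup fails on Windows; and swapping in `AF_INET` changes the address family semantics (a file descriptor created for one family cannot be wrapped as the other), so the subsequent connection failure is expected. A small feature-detection sketch, as an illustration rather than a fix for uvicorn's code path:

```python
import socket

# Windows builds of CPython do not expose socket.AF_UNIX, so
# cross-platform code should feature-detect rather than assume it.
if hasattr(socket, "AF_UNIX"):
    family = socket.AF_UNIX  # POSIX: Unix domain sockets available
else:
    family = socket.AF_INET  # Windows fallback: TCP loopback instead
print(family)
```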
bentoml/BentoML
GitHub
04/23/2024, 1:19 AM
Has the `pre-commit run -a` script passed (instructions)?
☐ Did you read through the contribution guidelines and follow the development guidelines?
☐ Did your changes require updates to the documentation? Have you updated those accordingly? Here are documentation guidelines and tips on writing docs.
☐ Did you write tests to cover your changes?
bentoml/BentoML
GitHub Actions: bento_server_http-e2e-tests (python3.11.macos-latest)
✅ 29 other checks have passed
29/30 successful checks
GitHub
04/23/2024, 3:23 AM
- "cmake==3.29.2"
- "dlib==19.24.4"
- "scikit-image==0.22.0"
- "face-recognition==1.3.0"
- "opencv-python==4.9.0.80"
- "numpy==1.26.2"
- "tensorflow==2.16.1"
- "mtcnn==0.1.1"
- "bentoml==1.2.11"
bentoml/BentoML
GitHub
04/23/2024, 9:31 AM
This API will not respect any 'bentofile.yaml' files. Build options should instead be provided
via function call parameters.
Args:
service: import str for finding the bentoml.Service instance build target
labels: optional immutable labels for carrying contextual info
description: optional description string in markdown format
include: list of file paths and patterns specifying files to include in Bento,
default is all files under build_ctx, beside the ones excluded from the
exclude parameter or a :code:`.bentoignore` file for a given directory
exclude: list of file paths and patterns to exclude from the final Bento archive
docker: dictionary for configuring Bento's containerization process, see details
in :class:`bentoml._internal.bento.build_config.DockerOptions`
python: dictionary for configuring Bento's python dependencies, see details in
:class:`bentoml._internal.bento.build_config.PythonOptions`
conda: dictionary for configuring Bento's conda dependencies, see details in
:class:`bentoml._internal.bento.build_config.CondaOptions`
version: Override the default auto generated version str
build_ctx: Build context directory, when used as
_bento_store: save Bento created to this BentoStore
Returns:
Bento: a Bento instance representing the materialized Bento saved in BentoStore
Example:
.. code-block::

    import bentoml

    bentoml.build(
        service="fraud_detector.py:svc",
        version="any_version_label",  # override default version generator
        description=open("README.md").read(),
        include=['*'],
        exclude=[],  # files to exclude can also be specified with a .bentoignore file
        labels={
            "foo": "bar",
            "team": "abc"
        },
        python=dict(
            packages=["tensorflow", "numpy"],
            # requirements_txt="./requirements.txt",
            index_url="http://<api token>:@mycompany.com/pypi/simple",
            trusted_host=["mycompany.com"],
            find_links=['thirdparty..'],
            extra_index_url=["..."],
            pip_args="ANY ADDITIONAL PIP INSTALL ARGS",
            wheels=["./wheels/*"],
            lock_packages=True,
        ),
        docker=dict(
            distro="amazonlinux2",
            setup_script="setup_docker_container.sh",
            python_version="3.8",
        ),
    )
"""
When I use this code I'm getting the error:
bug: module 'bentoml' has no attribute 'build'
To reproduce
No response
Expected behavior
a Bento is built
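A hedged workaround sketch: the later fix commit (#4689) suggests the implementation lives below the top-level package rather than being gone, so resolving the attribute with a fallback avoids hard-coding where it is exposed. The helper below is generic stdlib code; the `bentoml` / `bentoml.bentos` module paths in the comment are assumptions, not verified here:

```python
import importlib


def resolve_attr(primary_mod: str, name: str, fallback_mod: str):
    """Look up `name` on primary_mod, falling back to fallback_mod
    when the attribute is missing from the first module."""
    attr = getattr(importlib.import_module(primary_mod), name, None)
    if attr is None:
        attr = getattr(importlib.import_module(fallback_mod), name)
    return attr


# Illustrative use against the report above (module paths assumed):
#     build = resolve_attr("bentoml", "build", "bentoml.bentos")
#     bento = build(service="fraud_detector.py:svc", include=["*"])
```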
Environment
bentoml: 1.2.12
python: 3.10
* * *
Environment variable
BENTOML_DEBUG=''
BENTOML_QUIET=''
BENTOML_BUNDLE_LOCAL_BUILD=''
BENTOML_DO_NOT_TRACK=''
BENTOML_CONFIG=''
BENTOML_CONFIG_OPTIONS=''
BENTOML_PORT=''
BENTOML_HOST=''
BENTOML_API_WORKERS=''
System information
`bentoml`: 1.2.12
`python`: 3.10.13
`platform`: Linux-5.15.0-1057-azure-x86_64-with-glibc2.31
`uid_gid`: 1004:1004
`conda`: 24.1.2
`in_conda_env`: True
conda_packages
name: py310
channels:
- defaults
dependencies:
- _libgcc_mutex=0.1=main
- _openmp_mutex=5.1=1_gnu
- asttokens=2.0.5=pyhd3eb1b0_0
- bzip2=1.0.8=h5eee18b_5
- ca-certificates=2024.3.11=h06a4308_0
- comm=0.2.1=py310h06a4308_0
- debugpy=1.6.7=py310h6a678d5_0
- decorator=5.1.1=pyhd3eb1b0_0
- exceptiongroup=1.2.0=py310h06a4308_0
- executing=0.8.3=pyhd3eb1b0_0
- ipykernel=6.28.0=py310h06a4308_0
- ipython=8.20.0=py310h06a4308_0
- jedi=0.18.1=py310h06a4308_1
- jupyter_client=8.6.0=py310h06a4308_0
- jupyter_core=5.5.0=py310h06a4308_0
- ld_impl_linux-64=2.38=h1181459_1
- libffi=3.4.4=h6a678d5_0
- libgcc-ng=11.2.0=h1234567_1
- libgomp=11.2.0=h1234567_1
- libsodium=1.0.18=h7b6447c_0
- libstdcxx-ng=11.2.0=h1234567_1
- libuuid=1.41.5=h5eee18b_0
- matplotlib-inline=0.1.6=py310h06a4308_0
- ncurses=6.4=h6a678d5_0
- nest-asyncio=1.6.0=py310h06a4308_0
- openssl=3.0.13=h7f8727e_0
- packaging=23.2=py310h06a4308_0
- parso=0.8.3=pyhd3eb1b0_0
- pexpect=4.8.0=pyhd3eb1b0_3
- pip=23.3.1=py310h06a4308_0
- platformdirs=3.10.0=py310h06a4308_0
- prompt-toolkit=3.0.43=py310h06a4308_0
- prompt_toolkit=3.0.43=hd3eb1b0_0
- psutil=5.9.0=py310h5eee18b_0
- ptyprocess=0.7.0=pyhd3eb1b0_2
- pure_eval=0.2.2=pyhd3eb1b0_0
- pygments=2.15.1=py310h06a4308_1
- python=3.10.13=h955ad1f_0
- python-dateutil=2.8.2=pyhd3eb1b0_0
- pyzmq=25.1.2=py310h6a678d5_0
- readline=8.2=h5eee18b_0
- setuptools=68.2.2=py310h06a4308_0
- six=1.16.0=pyhd3eb1b0_1
- sqlite=3.41.2=h5eee18b_0
- stack_data=0.2.0=pyhd3eb1b0_0
- tk=8.6.12=h1ccaba5_0
- tornado=6.3.3=py310h5eee18b_0
- traitlets=5.7.1=py310h06a4308_0
- tzdata=2024a=h04d1e81_0
- wcwidth=0.2.5=pyhd3eb1b0_0
- wheel=0.41.2=py310h06a4308_0
- xz=5.4.6=h5eee18b_0
- zeromq=4.3.5=h6a678d5_0
- zlib=1.2.13=h5eee18b_0
prefix: /opt/conda/envs/py310
pip_packages
accelerate==0.28.0
aiofiles==23.2.1
aiohttp==3.9.3
aiosignal==1.3.1
alembic==1.13.1
altair==5.2.0
annotated-types==0.6.0
antlr4-python3-runtime==4.9.3
anyio==4.3.0
appdirs==1.4.4
asgiref==3.8.0
asteroid-filterbanks==0.4.0
asttokens @ file:///opt/conda/conda-bld/asttokens_1646925590279/work
async-timeout==4.0.3
attrs==23.2.0
audio2numpy==0.1.2
audioread==3.0.1
auto_gptq==0.7.1
av==11.0.0
backoff==2.2.1
beautifulsoup4==4.12.3
bentoml==1.2.12
bitsandbytes==0.41.3.post2
blinker==1.7.0
boto3==1.34.67
botocore==1.34.67
bs4==0.0.2
build==0.10.0
cachetools==5.3.3
cattrs==23.1.2
certifi==2024.2.2
cffi==1.16.0
chardet==5.2.0
charset-normalizer==3.3.2
circus==0.18.0
click==8.1.7
click-option-group==0.5.6
cloudpickle==3.0.0
colorama==0.4.6
coloredlogs==15.0.1
colorlog==6.8.2
comm @ file:///croot/comm_1709322850197/work
contextlib2==21.6.0
contourpy==1.2.0
cryptography==42.0.5
ctranslate2==4.1.0
cuda-python==12.4.0
cupy-cuda12x==12.1.0
cycler==0.12.1
dataclasses-json==0.6.4
datasets==2.18.0
debugpy @ file:///croot/debugpy_1690905042057/work
decorator @ file:///opt/conda/conda-bld/decorator_1643638310831/work
deepmerge==1.1.1
Deprecated==1.2.14
deprecation==2.1.0
diffusers==0.27.2
dill==0.3.8
dirtyjson==1.0.8
diskcache==5.6.3
distlib==0.3.8
distro==1.9.0
docker==7.0.0
docopt==0.6.2
einops==0.7.0
exceptiongroup @ file:///croot/exceptiongroup_1706031385326/work
executing @ file:///opt/conda/conda-bld/executing_1646925071911/work
fastapi==0.110.0
fastcore==1.5.29
faster-whisper==1.0.0
fastrlock==0.8.2
ffmpeg==1.4
ffmpy==0.3.2
filelock==3.13.1
filetype==1.2.0
flash-attn==2.5.7
flatbuffers==24.3.25
fonttools==4.50.0
frozenlist==1.4.1
fs==2.4.16
fs-s3fs==1.1.1
fsspec==2024.2.0
ftfy==6.2.0
gekko==1.0.7
germansentiment==1.1.0
ghapi==1.0.4
gitdb==4.0.11
GitPython==3.1.42
googleapis-common-protos==1.56.2
gradio==4.22.0
gradio_client==0.13.0
greenlet==3.0.3
grpcio==1.62.1
grpcio-channelz==1.48.2
grpcio-health-checking==1.48.2
grpcio-reflection==1.48.2
h11==0.14.0
httpcore==1.0.4
httptools==0.6.1
httpx==0.27.0
huggingface-hub==0.22.2
humanfriendly==10.0
HyperPyYAML==1.2.2
idna==3.6
img2pdf==0.5.1
importlib-metadata==6.11.0
importlib_resources==6.4.0
inflection==0.5.1
interegular==0.3.3
ipykernel @ file:///croot/ipykernel_1705933831282/work
ipython @ file:///croot/ipython_1704833016303/work
jedi @ file:///tmp/build/80754af9/jedi_1644315229345/work
Jinja2==3.1.3
jmespath==1.0.1
joblib==1.3.2
jsonpatch==1.33
jsonpointer==2.4
jsonschema==4.21.1
jsonschema-specifications==2023.12.1
julius==0.2.7
jupyter_client @ file:///croot/jupy…
bentoml/BentoML
GitHub
04/23/2024, 10:23 AM
Has the `pre-commit run -a` script passed (instructions)?
☐ Did you read through the contribution guidelines and follow the development guidelines?
☐ Did your changes require updates to the documentation? Have you updated those accordingly? Here are documentation guidelines and tips on writing docs.
☐ Did you write tests to cover your changes?
bentoml/BentoML
✅ All checks have passed
3/3 successful checks
GitHub
04/23/2024, 10:32 AM
<https://github.com/bentoml/BentoML/tree/main|main>
by Sherlock113
<https://github.com/bentoml/BentoML/commit/cab59c8d9e2ae522e4b9e25ac0beb80b44e722bd|cab59c8d>
- docs: fix indentation in build option docs (#4688)
bentoml/BentoML
GitHub
04/24/2024, 11:42 AM
Has the `pre-commit run -a` script passed (instructions)?
☐ Did you read through the contribution guidelines and follow the development guidelines?
☐ Did your changes require updates to the documentation? Have you updated those accordingly? Here are documentation guidelines and tips on writing docs.
☐ Did you write tests to cover your changes?
bentoml/BentoML
GitHub Actions: evergreen
GitHub Actions: report-coverage
GitHub Actions: bento_server_http-e2e-tests (python3.8.macos-latest)
✅ 27 other checks have passed
27/30 successful checks
GitHub
04/24/2024, 12:15 PM
<https://github.com/bentoml/BentoML/tree/main|main>
by frostming
<https://github.com/bentoml/BentoML/commit/a124df5a3b30fa2b3f938b900fde8840011d3906|a124df5a>
- fix: bug: module 'bentoml' has no attribute 'build' (#4689)
bentoml/BentoML
GitHub
04/24/2024, 1:07 PM