# ask-for-help
b
Hello. Can you share those logs with me?
Here is an example with Keras and MLflow.
m
Yes, when I run model serve --production:
```
2022-12-20 19:04:41.595163: W tensorflow/compiler/xla/stream_executor/platform/default/dso_loader.cc:64] Could not load dynamic library 'libcudart.so.11.0'; dlerror: libcudart.so.11.0: cannot open shared object file: No such file or directory
2022-12-20 19:04:41.595387: I tensorflow/compiler/xla/stream_executor/cuda/cudart_stub.cc:29] Ignore above cudart dlerror if you do not have a GPU set up on your machine.
2022-12-20 19:04:43.830589: W tensorflow/compiler/xla/stream_executor/platform/default/dso_loader.cc:64] Could not load dynamic library 'libnvinfer.so.7'; dlerror: libnvinfer.so.7: cannot open shared object file: No such file or directory
2022-12-20 19:04:43.831485: W tensorflow/compiler/xla/stream_executor/platform/default/dso_loader.cc:64] Could not load dynamic library 'libnvinfer_plugin.so.7'; dlerror: libnvinfer_plugin.so.7: cannot open shared object file: No such file or directory
2022-12-20 19:04:43.831763: W tensorflow/compiler/tf2tensorrt/utils/py_utils.cc:38] TF-TRT Warning: Cannot dlopen some TensorRT libraries. If you would like to use Nvidia GPU with TensorRT, please make sure the missing libraries mentioned above are installed properly.
2022-12-20T19:04:45+0000 [INFO] [cli] Service loaded from Bento directory: bentoml.Service(tag="kitchenware-classification:lahxnat75kvehrft", path="/home/bentoml/bento/")
2022-12-20T19:04:45+0000 [INFO] [cli] Environ for worker 0: set CPU thread count to 2
```
Did you get this to run successfully locally?
m
Yes
b
Can you share your bentofile.yaml with me?
It seems like you might not be including some files.
m
Yes, I think maybe I'm not including something necessary in the bentofile.
```yaml
service: "service:svc"
description: "file: ./README.md"
labels:
  owner: mary.orihuela
  stage: dev
include:
  - "*.py"  # A pattern for matching which files to include in the bento
python:
  packages:  # Additional pip packages required by the service
    - numpy
    - tensorflow
    - pillow
```
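As a side note on why a narrow `include` pattern can drop needed files: the snippet below illustrates glob-style matching with Python's stdlib `fnmatch` (this is just an illustration, not BentoML's exact matcher, and the file names are made up). A pattern like `"*.py"` matches source files but not data or model artifacts such as `.h5` weights.

```python
from fnmatch import fnmatch

# Hypothetical project files (illustrative only)
files = ["service.py", "train.py", "README.md", "models/weights.h5"]

# Keep only files matching the include pattern from the bentofile
included = [f for f in files if fnmatch(f, "*.py")]

print(included)  # ['service.py', 'train.py'] -- weights.h5 is left out
```

If the service needs non-Python artifacts at runtime, they have to be covered by an include pattern too.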
Did I also send you my service.py?
a
Hi there, what is the issue that you run into when you `containerize`?
m
Hi, I don't get errors with `containerize`. The errors start when I run it with `docker run <model> serve --production`. It works locally, but has errors in Docker.
There you can see it, from the `bentoml build`.
a
Can you send your `service.py`?
m
yes
l
Hi Marilina, do you specify a CUDA version in your bentofile.yaml? The error message suggests that CUDA is required but not found. Please refer to our documentation about GPU support in the docker image here: https://docs.bentoml.org/en/latest/concepts/bento.html#gpu-support
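Following that suggestion, a sketch of what the bentofile addition might look like (the `cuda_version` value here is an assumption; check the linked docs for the supported versions and exact option names):

```yaml
# Hypothetical docker section for the bentofile.yaml above
docker:
  cuda_version: "11.6.2"  # assumed value; pick a version the docs list as supported
```

The rebuilt container would also need to be started with GPU access (e.g. Docker's `--gpus` flag) for the CUDA libraries to be usable at runtime.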