# ask-for-help
a
oh sorry, this is def a bug. Let me put up a PR for this. Thanks for spotting this.
For now, what you can do is save the tokenizer as a custom object:

```python
bentoml.transformers.save_model("model", model, custom_objects={"tokenizer": tokenizer})
```
And you can access this tokenizer like so:

```python
bentomodel = bentoml.models.get("model")

tokenizer = bentomodel.custom_objects['tokenizer']
```
n
Thank you. I will try it. 👍
a
I notice that you are also running save inside `service.py`. This is usually discouraged because `service.py` will be forked to a child process when running `bentoml serve`, and you probably don't want to save the model every time you serve the service.
Usually the save should be done after training, but in the case of transformers you can create a separate file to run before the actual serving time.
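A minimal sketch of what that separate file might look like, mirroring the `save_model` call from earlier in the thread. The file name `save_model.py` and the Hugging Face checkpoint name are assumptions purely for illustration; substitute your own model and tokenizer:

```python
# save_model.py -- hypothetical standalone script, run once before `bentoml serve`.
# The checkpoint name "distilbert-base-uncased" is just an example.
import bentoml
from transformers import AutoModel, AutoTokenizer

model = AutoModel.from_pretrained("distilbert-base-uncased")
tokenizer = AutoTokenizer.from_pretrained("distilbert-base-uncased")

# Save the model once, attaching the tokenizer as a custom object
# so service.py can retrieve both at serving time.
bentoml.transformers.save_model(
    "model",
    model,
    custom_objects={"tokenizer": tokenizer},
)
```

Running `python save_model.py` after training writes the model to the local BentoML store, so `service.py` only needs `bentoml.models.get("model")` and never re-saves on each serve.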
n
Oh, thank you for the information. I will use a separate file for it.