# ask-for-help
j
It seems that the object is not picklable. I think a custom runner is what you want here: https://github.com/bentoml/BentoML/tree/main/examples/custom_model_runner
h
@Jiang I see the tutorial you linked loads a model by
mnist_model = bentoml.pytorch.get("mnist_cnn:latest")
at the beginning. But I am trying to save this model in the first place, even before making it a runnable, and pickling isn't working. Maybe I am not entirely following what you suggested?
j
The key is to just skip the normal save and load steps in BentoML. Just include the model file in your project and load it directly in the custom runner with the original API (fasttext.load_model here).
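For reference, a minimal sketch of that approach with the BentoML 1.x custom-runner API. The file name fasttext_model.bin, the runner/service names, and the predict signature are placeholders for illustration, not taken from this thread:

```python
import fasttext
import bentoml
from bentoml.io import JSON, Text


class FastTextRunnable(bentoml.Runnable):
    # fastText inference runs on CPU and is thread-safe enough for this sketch.
    SUPPORTED_RESOURCES = ("cpu",)
    SUPPORTS_CPU_MULTI_THREADING = True

    def __init__(self):
        # Skip bentoml save/get entirely: the fastText model object is not
        # picklable, so load the file shipped with the project using the
        # original fasttext API. "fasttext_model.bin" is an assumed path.
        self.model = fasttext.load_model("fasttext_model.bin")

    @bentoml.Runnable.method(batchable=False)
    def predict(self, text: str) -> dict:
        labels, scores = self.model.predict(text)
        return {"labels": list(labels), "scores": [float(s) for s in scores]}


fasttext_runner = bentoml.Runner(FastTextRunnable, name="fasttext_runner")

svc = bentoml.Service("fasttext_classifier", runners=[fasttext_runner])


@svc.api(input=Text(), output=JSON())
def classify(text: str) -> dict:
    # Delegate to the custom runner; it owns the loaded fastText model.
    return fasttext_runner.predict.run(text)
```

If you package this with bentofile.yaml, remember to list the .bin model file under the include section so it ships inside the Bento alongside the service code.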
h
Thank you for the advice! Will look into that!