# ask-for-help
larme (shenyang)
Hi Steven, did you save the model with `batchable=True` like here: https://docs.bentoml.org/en/latest/frameworks/transformers.html#adaptive-batching ?
steven
@larme (shenyang) Yes I did. Do I need to remove it?
larme (shenyang)
Could you try:

```python
result = await runner.async_run([inputs.dict()])
```

then change your `preprocess` function to handle a list of dicts? Or you can just set `batchable=False`.
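To make the suggestion concrete, here is a minimal sketch of a `preprocess` that accepts a list of dicts, which is the shape `async_run([inputs.dict()])` would hand it. `TextInput` and its `text` field are made-up stand-ins for a pydantic input model, not part of the thread:

```python
from dataclasses import dataclass, asdict
from typing import Any

# Hypothetical input model; stands in for a pydantic BaseModel
# whose .dict() produces {"text": ...}.
@dataclass
class TextInput:
    text: str

    def dict(self) -> dict:
        return asdict(self)

def preprocess(batch: list[dict[str, Any]]) -> list[str]:
    # With batchable=True the runner delivers a *list* of inputs,
    # so iterate over the batch instead of assuming a single dict.
    return [item["text"].strip().lower() for item in batch]

inputs = TextInput(text="  Hello World  ")
print(preprocess([inputs.dict()]))  # a one-element batch
```

The key change is that `preprocess` iterates, so the same function works for a one-element batch and a larger one.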
steven
I see. Let me try. But I also want to ask: if I set it to True, does that mean I need to handle batch inference in the predict function?
larme (shenyang)
Yep. If your model is more efficient when doing batch inference, then I suggest you find a way to make it work. Otherwise you can just turn off batching.
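A toy sketch of why batching can pay off (a fake model with a fixed per-call cost, not a real inference engine): each call carries a setup overhead, so one batched call amortizes it while per-item calls pay it N times.

```python
import time

def fake_model_batch(texts: list[str]) -> list[int]:
    # Toy stand-in for a model forward pass: a fixed setup cost
    # per call, regardless of how many items are in the batch.
    time.sleep(0.01)          # per-call overhead (e.g. kernel launch)
    return [len(t) for t in texts]

def predict_one_by_one(texts: list[str]) -> list[int]:
    # N calls -> the overhead is paid N times.
    return [fake_model_batch([t])[0] for t in texts]

def predict_batched(texts: list[str]) -> list[int]:
    # One call -> the overhead is paid once for the whole batch.
    return fake_model_batch(texts)

texts = ["hi", "hello", "hey"] * 10
start = time.perf_counter()
one_by_one = predict_one_by_one(texts)
t1 = time.perf_counter() - start

start = time.perf_counter()
batched = predict_batched(texts)
t2 = time.perf_counter() - start

assert one_by_one == batched   # same results, very different cost
print(f"one-by-one: {t1:.3f}s, batched: {t2:.3f}s")
```

If the real model has no batch advantage (purely sequential work per item), `batchable=False` avoids the extra list-handling for no loss.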
steven
Cool, I just tried it, and it acts as I expect now. Thanks so much, @larme (shenyang)!
larme (shenyang)
welcome