# ask-for-help
Hey there, it seems like we don't have support for this use case. What you can do is save the processor and the FlaxWhisper model separately with `bentoml.transformers.save_model`, then implement a custom runner that uses the two together. Additionally, this use case is fun to implement because we don't have TPU support just yet, so you would also have to implement a custom strategy for that.
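A rough sketch of what that could look like, assuming a recent BentoML 1.x release where `save_model` accepts pre-trained instances directly; the checkpoint (`openai/whisper-tiny`), the model tags, and the `transcribe` method name are all placeholders for illustration:

```python
import bentoml
from transformers import WhisperProcessor, FlaxWhisperForConditionalGeneration

# One-time setup: save the processor and the Flax model as two
# separate BentoML models (tag names here are made up).
processor = WhisperProcessor.from_pretrained("openai/whisper-tiny")
model = FlaxWhisperForConditionalGeneration.from_pretrained("openai/whisper-tiny")
bentoml.transformers.save_model("whisper-processor", processor)
bentoml.transformers.save_model("whisper-flax-model", model)


class FlaxWhisperRunnable(bentoml.Runnable):
    # TPU is not a supported resource yet, so stick to CPU/GPU here.
    SUPPORTED_RESOURCES = ("nvidia.com/gpu", "cpu")
    SUPPORTS_CPU_MULTI_THREADING = True

    def __init__(self):
        # Load both saved models back inside the runner worker.
        self.processor = bentoml.transformers.load_model("whisper-processor:latest")
        self.model = bentoml.transformers.load_model("whisper-flax-model:latest")

    @bentoml.Runnable.method(batchable=False)
    def transcribe(self, audio_array):
        # Whisper expects 16 kHz audio; the processor turns it into log-mel features.
        inputs = self.processor(audio_array, sampling_rate=16000, return_tensors="np")
        output_ids = self.model.generate(inputs.input_features).sequences
        return self.processor.batch_decode(output_ids, skip_special_tokens=True)[0]


runner = bentoml.Runner(FlaxWhisperRunnable, name="flax_whisper")
```

The runner can then be passed to a `bentoml.Service` like any built-in one and called with `runner.transcribe.run(...)`.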
Ok, thank you very much for the answer! I guess I will go for the custom runner then. I don't think I will use it on TPU just yet, so that's not an issue for now.