# ask-for-help
Aaron Pham
IIRC you can pass arbitrary kwargs into the pipeline, and they are consumed in a specific sequence: the model receives them first, then the tokenizer, then the preprocessor. Maybe you can try passing those arguments into the pipeline and see what happens? The reason we only support pipeline in 1.0 is that in 0.13 we essentially had to maintain all of the custom logic across every combination of saving options, models, tokenizers, and pipelines. That was error-prone, and for 1.0 we decided to reduce our scope. Have you had a chance to take a look at transformers' custom pipeline? It seems like it might fit your use case here. We will discuss more internally. Thanks for the feedback.
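(For reference, a minimal sketch of the kwargs pass-through described above, assuming a recent transformers release; the model checkpoint and the `aggregation_strategy` kwarg are illustrative, not taken from this thread:)

```python
# Minimal sketch, assuming a recent transformers release.
# The checkpoint and the extra kwarg are illustrative only.
from transformers import pipeline

ner = pipeline("token-classification", model="dslim/bert-base-NER")

# Extra kwargs supplied at call time are routed to the pipeline's
# preprocess/forward/postprocess steps; keys the pipeline does not
# recognize raise a TypeError rather than being silently dropped.
results = ner(
    "BentoML was discussed in the #ask-for-help channel.",
    aggregation_strategy="simple",
)
print(results)
```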
Shihgian Lee
@Aaron Pham Thank you for your reply.
> IIRC you can pass arbitrary kwargs into the pipeline, and they are consumed in a specific sequence
I looked at the source of TokenClassificationPipeline, and it doesn't support the additional argument I mentioned. I don't see a custom pipeline in transformers 4.6; it looks complicated, and upgrading transformers from 4.6 to the latest version (4.26.1) is not trivial. I find the pipeline restrictive and not helpful in most cases, and I wish we weren't required to use it in 1.x. I can't move forward with BentoML 1.x without a large investment of time and effort, and ours is a small team. So I am basically stuck with BentoML 0.13 for Hugging Face models 😞
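(For reference, a rough sketch of the custom-pipeline route mentioned above; it requires a transformers release that ships `PIPELINE_REGISTRY`, roughly 4.21+, so it is not available in 4.6. The class name, task name, and `my_extra_arg` parameter are hypothetical placeholders:)

```python
# Rough sketch; needs a transformers release with PIPELINE_REGISTRY
# (roughly 4.21+), not 4.6. Names below are hypothetical placeholders.
from transformers import AutoModelForTokenClassification, Pipeline
from transformers.pipelines import PIPELINE_REGISTRY

class MyTokenPipeline(Pipeline):
    def _sanitize_parameters(self, my_extra_arg=None, **kwargs):
        # Route the custom kwarg to the postprocess step.
        postprocess_kwargs = {}
        if my_extra_arg is not None:
            postprocess_kwargs["my_extra_arg"] = my_extra_arg
        return {}, {}, postprocess_kwargs

    def preprocess(self, inputs):
        return self.tokenizer(inputs, return_tensors=self.framework)

    def _forward(self, model_inputs):
        return self.model(**model_inputs)

    def postprocess(self, model_outputs, my_extra_arg=None):
        # Custom logic the stock TokenClassificationPipeline lacks
        # would live here.
        return model_outputs.logits.argmax(-1).tolist()

PIPELINE_REGISTRY.register_pipeline(
    "my-token-task",
    pipeline_class=MyTokenPipeline,
    pt_model=AutoModelForTokenClassification,
)
```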
Bo
@Shihgian Lee This is something we are working on: improving existing capabilities. Let me work with the team and see what we can do.
Shihgian Lee
Thank you @Bo! Please let me know if there is something we can assist with.