# ask-for-help
Aaron Pham
Hi there, bentoml.Context is a 1.0 feature. The latest docs refer to BentoML 1.0.
Zi Wen Lim
thanks for letting me know @Aaron Pham!
Chaoyu
@Zi Wen Lim for 0.13, you may use the InferenceTask object: https://docs.bentoml.org/en/v0.13.1/concepts.html#defining-a-batch-api
Zi Wen Lim
thanks @Chaoyu! Taking a look
Aaron Pham
Please do let us know if you still have questions on this.
Zi Wen Lim
A quick follow-up question: would the InferenceTask object work with `StringInput()` and `batch=False` in the predict function? The docs use JSON input and `batch=True`. I currently have inputs passed into predict as a CSV string, i.e.
`'3.000000,3.000000,3.000000,3.000000\n1.000000,1.000000,1.000000,1.000000\n2.000000,2.000000,2.000000,2.000000'`
for the iris classifier model that takes in 4 inputs (there are 3 rows here, hence expecting 3 inference outputs like `1,0,0`)
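[For reference, a minimal sketch of the parsing step discussed above. It assumes BentoML 0.13's `StringInput()` with `batch=False` hands the raw request body to the predict method as a single string; the BentoML service wiring is omitted so the CSV-parsing logic stands on its own, and `parse_csv_payload` is a hypothetical helper name, not a BentoML API.]

```python
# Sketch: turning the newline-delimited CSV string from the thread above
# into per-row float features for the iris classifier.
# In a 0.13 service, this parsing would live inside a method decorated with
# @api(input=StringInput(), batch=False); that wiring is intentionally omitted here.

def parse_csv_payload(payload: str) -> list[list[float]]:
    """Split a newline-delimited CSV string into rows of float features."""
    return [
        [float(value) for value in line.split(",")]
        for line in payload.strip().splitlines()
        if line  # skip any blank lines
    ]

payload = (
    "3.000000,3.000000,3.000000,3.000000\n"
    "1.000000,1.000000,1.000000,1.000000\n"
    "2.000000,2.000000,2.000000,2.000000"
)
rows = parse_csv_payload(payload)
# Three rows of four features each, matching the iris model's expected input,
# so the model would return three predictions, one per row.
```

Each parsed row can then be fed to the model (e.g. stacked into a NumPy array) to produce the three per-row predictions mentioned above.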
Aaron Pham
AFAIK it should work, cc @Chaoyu
Zi Wen Lim
thanks for the help guys, it works!
Aaron Pham
sounds great!