# ask-for-help
Chaoyu:
Hi Amar, I assume you mean Nvidia TensorRT?
Yes, it is supported in the ongoing BentoML & Triton integration work, where users can use the TensorRT backend in Triton.
BentoML itself also supports TensorRT: you can use TensorRT's Python API to load and run a model via a custom runner.
Amar:
@Chaoyu I meant Triton Inference Server
Chaoyu:
got it, a beta version is coming out in the next release
Amar:
Thanks @Chaoyu! Any idea when we can expect the next release?