I think Stable Diffusion is not optimized for batching at the moment.
Bo
10/06/2022, 8:06 PM
I would love to see your benchmark, if you are working on one
Bo
10/06/2022, 8:06 PM
And feel free to make a PR for batching capability
larme (shenyang)
10/06/2022, 10:56 PM
we disabled batching for these reasons:
1. in our offline testing, batching does not seem to give us much performance gain
2. with the fp32 model, any batch_size > 2 will run a 3080 Ti out of memory during inference, so the batch size cannot be large anyway
3. inputs inside the same batch can only differ in their prompts; all other parameters must be identical across the batch (see the sketch below). This is especially inconvenient for img2img, because each input requires its own init image
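For reference, here is a minimal sketch of what "same parameters, different prompts" batching looks like with the diffusers StableDiffusionPipeline (the model id and argument values are just placeholders, not our actual config):
```python
import torch
from diffusers import StableDiffusionPipeline

# Load the fp32 pipeline (hypothetical model id for illustration).
pipe = StableDiffusionPipeline.from_pretrained(
    "CompVis/stable-diffusion-v1-4"
).to("cuda")

# Only the prompts can vary per input in a batch...
prompts = ["a photo of a cat", "a photo of a dog"]

# ...while every other argument is shared by the whole batch.
result = pipe(
    prompt=prompts,
    num_inference_steps=50,  # shared across the batch
    guidance_scale=7.5,      # shared across the batch
)
images = result.images
```
There is no clean way to pass a different init image, strength, or step count per input through a single call like this, which is why batching img2img requests is awkward.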