It should work with the sb/onnxruntime branch of bes-dev/stable_diffusion.openvino on GitHub, but I have not tested it.

Note: not compatible with the diffusers ONNX pipeline.

unet.onnx and vae_decoder.onnx have been tested and confirmed to work in a private project of my own.
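
For reference, here is a minimal sketch of loading one of the exported graphs with onnxruntime and inspecting its inputs. This assumes onnxruntime is installed; the actual input names and shapes are defined by the exported graph, so check them before wiring the model into a pipeline.

```python
# Minimal sketch: open an exported graph with onnxruntime and list its inputs.
# Input names, shapes, and dtypes depend on the exported graph.
import onnxruntime as ort

session = ort.InferenceSession("unet.onnx", providers=["CPUExecutionProvider"])
for inp in session.get_inputs():
    print(inp.name, inp.shape, inp.type)
```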


Converted from the pre-trained weights wd-v1-3-full.ckpt in the hakurei/waifu-diffusion-v1-3 repository on Hugging Face.

The text encoder is taken from the original repository: bes-dev/stable-diffusion-v1-4-onnx on Hugging Face.

The ONNX computation graph in this repository is the same as in bes-dev/stable-diffusion-v1-4-onnx on Hugging Face.
