
Stable-Diffusion 3.5 Lite - Onnx Olive DirectML Optimized

Information:

This conversion uses the int4 data type for TextEncoder3, which drops the VRAM requirement to between 8GB and 16GB. However, this results in a slight quality drop compared to the base stable-diffusion-3.5-medium model.
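For reference, the ONNX models in this repository are intended to run on the DirectML execution provider. The snippet below is a minimal sketch of loading one of the exported models directly with ONNX Runtime and DirectML; the file path and sub-folder name are illustrative only, and OnnxStack (shown further down) handles this for you when using the full pipeline.

using Microsoft.ML.OnnxRuntime;

// Enable the DirectML execution provider
// (requires the Microsoft.ML.OnnxRuntime.DirectML NuGet package)
var sessionOptions = new SessionOptions();
sessionOptions.AppendExecutionProvider_DML(0); // 0 = default GPU

// Path and sub-folder are example values, not the repository's exact layout
using var session = new InferenceSession(
    "D:\\Models\\stable-diffusion-3.5-lite-onnx\\text_encoder_3\\model.onnx",
    sessionOptions);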

Original Model

https://huggingface.co/stabilityai/stable-diffusion-3.5-medium

C# Inference Demo

https://github.com/TensorStack-AI/OnnxStack

// Create Pipeline
var pipeline = StableDiffusion3Pipeline.CreatePipeline("D:\\Models\\stable-diffusion-3.5-lite-onnx");

// Prompt
var promptOptions = new PromptOptions
{
    Prompt = "Create a scene of a mystical shaman, with animal skins and feathers, performing a ritual in a sacred grove."
};

// Run pipeline
var result = await pipeline.GenerateImageAsync(promptOptions);

// Save Image Result
await result.SaveAsync("Result.png");
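Generation settings such as step count, guidance scale, and seed can also be passed to the pipeline. The sketch below assumes OnnxStack's PromptOptions/SchedulerOptions properties and the GenerateImageAsync overload that accepts both; exact property names may vary with the OnnxStack version you use.

// Prompt with an optional negative prompt (assumed property names)
var promptOptions = new PromptOptions
{
    Prompt = "Create a scene of a mystical shaman, with animal skins and feathers, performing a ritual in a sacred grove.",
    NegativePrompt = "blurry, low quality"
};

// Scheduler settings (illustrative values)
var schedulerOptions = new SchedulerOptions
{
    InferenceSteps = 28,
    GuidanceScale = 4.5f,
    Seed = 42
};

// Run pipeline with both option sets
var result = await pipeline.GenerateImageAsync(promptOptions, schedulerOptions);
await result.SaveAsync("Result.png");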

Inference Result

[Example image generated with the demo prompt above]
