# Stable-Diffusion 3.5 Lite - Onnx Olive DirectML Optimized

### Information:
This conversion uses the `int4` data type for the `TextEncoder3`, which reduces the VRAM requirement to between 8GB and 16GB. However, this results in a slight quality drop compared to the base `stable-diffusion-3.5-medium` model.

## Original Model
https://huggingface.co/stabilityai/stable-diffusion-3.5-medium

## C# Inference Demo
https://github.com/TensorStack-AI/OnnxStack

```csharp
// Create pipeline from the local ONNX model folder
var pipeline = StableDiffusion3Pipeline.CreatePipeline("D:\\Models\\stable-diffusion-3.5-lite-onnx");

// Prompt
var promptOptions = new PromptOptions
{
    Prompt = "Create a scene of a mystical shaman, with animal skins and feathers, performing a ritual in a sacred grove."
};

// Run pipeline
var result = await pipeline.GenerateImageAsync(promptOptions);

// Save image result
await result.SaveAsync("Result.png");
```

## Inference Result
![Intro Image](Sample.png)
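
## Scheduler Options (Optional)
For finer control over generation, OnnxStack pipelines can also take scheduler settings alongside the prompt. The sketch below extends the demo above with an explicit step count, guidance scale, and seed; the exact property names (`InferenceSteps`, `GuidanceScale`, `Seed`) and the `GenerateImageAsync(promptOptions, schedulerOptions)` overload are assumptions based on OnnxStack's other pipelines and may differ in your installed version.

```csharp
// Create pipeline from the local ONNX model folder
var pipeline = StableDiffusion3Pipeline.CreatePipeline("D:\\Models\\stable-diffusion-3.5-lite-onnx");

// Prompt
var promptOptions = new PromptOptions
{
    Prompt = "Create a scene of a mystical shaman, with animal skins and feathers, performing a ritual in a sacred grove."
};

// Scheduler options (property names assumed from OnnxStack's other pipelines)
var schedulerOptions = new SchedulerOptions
{
    InferenceSteps = 30,   // number of denoising steps
    GuidanceScale = 4.5f,  // classifier-free guidance strength
    Seed = 42              // fixed seed for reproducible output
};

// Run pipeline with the explicit scheduler options
var result = await pipeline.GenerateImageAsync(promptOptions, schedulerOptions);

// Save image result
await result.SaveAsync("ResultSeeded.png");
```

Fixing the seed is useful when comparing output quality against the base `stable-diffusion-3.5-medium` model, since the same prompt and seed should produce directly comparable images.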