stable-diffusion-v1-5-mnn
Introduction
This model is an 8-bit weight-quantized MNN model exported from stable-diffusion-v1-5 using mnn-diffusion-export.
# Version optimized for the OpenCL backend
python3 convert_mnn.py onnx_path mnn_save_path "--weightQuantBits=8 --transformerFuse"
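The conversion command above can also be driven from a small script. The sketch below is one way to do that, assuming convert_mnn.py from mnn-diffusion-export is available in the current directory; the onnx_path and mnn_save_path values are placeholders, not paths shipped with this repository.

```python
# Minimal sketch: invoke the conversion command above from Python.
# The paths below are placeholders (assumptions for illustration).
import subprocess

onnx_path = "./onnx"      # placeholder: location of the exported ONNX models
mnn_save_path = "./mnn"   # placeholder: where the converted MNN models go

subprocess.run(
    [
        "python3", "convert_mnn.py",
        onnx_path, mnn_save_path,
        # Same options as the command above: 8-bit weight quantization
        # plus transformer fusion, passed as a single quoted argument.
        "--weightQuantBits=8 --transformerFuse",
    ],
    check=True,  # raise if the conversion script exits with an error
)
```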