---
license: apache-2.0
---
# Mixtral-8x22b
A new mixture-of-experts (MoE) model by Mistral AI.
Model weights are not uploaded yet; they should be up by tomorrow. After the safetensors are finished, GGUF quants will be uploaded soon (assuming I have enough resources).
Magnet link and checksum: [https://twitter.com/mistralai/status/1777869263778291896](https://twitter.com/mistralai/status/1777869263778291896)
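
Once the GGUF quants are up, they could be loaded with llama-cpp-python along these lines. This is only a sketch: the repo id and quant filename pattern below are assumptions, not confirmed artifacts of this repository.

```python
# Hypothetical usage sketch for the planned GGUF quants (not yet uploaded).
# Requires: pip install llama-cpp-python huggingface_hub
from llama_cpp import Llama

llm = Llama.from_pretrained(
    repo_id="leafspark/Mixtral-8x22B-v0.1",  # assumed repo id, based on this page
    filename="*Q4_K_M.gguf",                 # hypothetical quant filename pattern
    n_ctx=4096,                              # context window for this session
)

out = llm("Q: What is a mixture-of-experts model? A:", max_tokens=64)
print(out["choices"][0]["text"])
```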