Update README.md
README.md CHANGED
@@ -1,10 +1,12 @@
 ---
-base_model:
-- mistralai/Mistral-Small-24B-Instruct-2501
+base_model: arcee-ai/Arcee-Blitz
 library_name: transformers
 license: apache-2.0
 ---

+6bpw exl2 quant of: https://huggingface.co/arcee-ai/Arcee-Blitz
+
+---
 **Arcee-Blitz (24B)** is a new Mistral-based 24B model distilled from DeepSeek, designed to be both **fast and efficient**. We view it as a practical “workhorse” model that can tackle a range of tasks without the overhead of larger architectures.

 ### Quantizations
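For reference, a minimal sketch of how a 6bpw exl2 quant such as this one might be loaded for inference with the exllamav2 library; this is not part of the commit above. The local model directory, prompt, and generation settings are placeholders, and the loader/generator API can differ between exllamav2 releases.

```python
# Hypothetical example: loading an exl2-quantized model with exllamav2.
# Assumes the quantized weights have already been downloaded locally.
from exllamav2 import ExLlamaV2, ExLlamaV2Config, ExLlamaV2Cache, ExLlamaV2Tokenizer
from exllamav2.generator import ExLlamaV2DynamicGenerator

model_dir = "/models/Arcee-Blitz-6bpw-exl2"  # placeholder local path

# Build the config from the quantized checkpoint directory.
config = ExLlamaV2Config(model_dir)
model = ExLlamaV2(config)

# A lazily allocated cache lets load_autosplit distribute layers across GPUs.
cache = ExLlamaV2Cache(model, lazy=True)
model.load_autosplit(cache)

tokenizer = ExLlamaV2Tokenizer(config)
generator = ExLlamaV2DynamicGenerator(model=model, cache=cache, tokenizer=tokenizer)

# Placeholder prompt and token budget.
print(generator.generate(prompt="Explain speculative decoding in one paragraph.",
                         max_new_tokens=200))
```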