Update README.md
README.md CHANGED
@@ -3,6 +3,8 @@
## Description
2bit imatrix GGUF quants of [NeverSleep/MiquMaid-v1-70B](https://huggingface.co/NeverSleep/MiquMaid-v1-70B)

+Imatrix generated from q8 of MiquMaid, 2000 chunks at 500 ctx. Dataset was Wikitext.
+
## Other quants:
EXL2: [3.5bpw](https://huggingface.co/Kooten/MiquMaid-v1-70B-3.5bpw-exl2), [3bpw](https://huggingface.co/Kooten/MiquMaid-v1-70B-3bpw-exl2), [2.4bpw](https://huggingface.co/Kooten/MiquMaid-v1-70B-2.4bpw-exl2)

@@ -23,5 +25,6 @@ GGUF:
### Response:
{reply}
```
+
## Contact
Kooten on discord
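For context on the added description line: an importance matrix of this kind (computed on a q8 quant, 2000 chunks at 500-token context, Wikitext as calibration data) is normally produced with llama.cpp's imatrix tool and then passed to the quantizer. A minimal sketch under those assumptions; the file names and the IQ2_XS target type are illustrative, not taken from this commit:

```sh
# Sketch only: file names and the IQ2_XS target type are assumptions.
# Binary names differ across llama.cpp versions (imatrix/quantize vs llama-imatrix/llama-quantize).

# 1) Compute the importance matrix from the q8_0 model on a Wikitext text file,
#    processing 2000 chunks at a context size of 500 tokens.
./imatrix -m MiquMaid-v1-70B.q8_0.gguf -f wikitext.txt -o imatrix.dat -c 500 --chunks 2000

# 2) Feed the imatrix to the quantizer to produce the 2-bit GGUF
#    (quantizing from an f16 GGUF is the usual starting point).
./quantize --imatrix imatrix.dat MiquMaid-v1-70B.f16.gguf MiquMaid-v1-70B.IQ2_XS.gguf IQ2_XS
```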