Commit 3c22490 by DeusImperator (parent 77bb5d6): Upload README.md
![Dark-Miqu.png](Dark-Miqu.png)

# Dark-Miqu-70B - EXL2 2.4bpw

This is a 2.4bpw EXL2 quant of [jukofyork/Dark-Miqu-70B](https://huggingface.co/jukofyork/Dark-Miqu-70B).

This quant was made using exllamav2-0.0.21 with the default dataset and settings.

In my local testing on Windows, this quant fits around 24k context in 24GB of VRAM (with EXL2 Q4 cache); you may be able to fit more depending on what else is using VRAM.

I briefly tested this quant in a few random RPs (including ones over 8k and 20k context) and it seems to work fine.
## Prompt Templates

This model uses the Mistral, Alpaca and Vicuna prompt formats.
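As a quick illustration, the commonly used shapes of those three templates can be built as simple string helpers. The exact spacing and newlines here are assumptions based on the standard forms of each format, not taken from this model card, so check the original readme below if generations look off.

```python
# Hypothetical helpers sketching the three prompt formats this model accepts.
# Spacing/newlines follow the commonly used forms of each template.

def mistral_prompt(user_message: str) -> str:
    """Mistral instruct format: the request is wrapped in [INST] ... [/INST]."""
    return f"[INST] {user_message} [/INST]"

def alpaca_prompt(instruction: str) -> str:
    """Alpaca format: '### Instruction:' / '### Response:' headers."""
    return (
        "### Instruction:\n"
        f"{instruction}\n\n"
        "### Response:\n"
    )

def vicuna_prompt(user_message: str) -> str:
    """Vicuna format: USER/ASSISTANT turns; generation continues after 'ASSISTANT:'."""
    return f"USER: {user_message}\nASSISTANT:"
```

Whichever format you pick, use it consistently for the whole conversation rather than mixing formats between turns.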

For more details, see the original readme below.
### Original readme below

---

A "dark" creative writing model with 32k context. Based off [miqu-1-70b](https://huggingface.co/miqudev/miqu-1-70b) but with greatly reduced "positivity" and "-isms". If you want happy endings, look elsewhere!
This model **excels** at writing Dark/Grimdark fantasy (see examples below).