Update README.md
README.md CHANGED
@@ -29,13 +29,12 @@ The model should handle 25-32k context window size.
 
 [ko-fi To buy sweets for my cat :3](https://ko-fi.com/icefog72)
 
-##
-Exl2 Quants
+## Exl2 Quants
 >- [4.2bpw-exl2](https://huggingface.co/icefog72/IceSakeRP-7b-4.2bpw-exl2)
 >- [6.5bpw-exl2](https://huggingface.co/icefog72/IceSakeRP-7b-6.5bpw-exl2)
 >- [8bpw-exl2](https://huggingface.co/icefog72/IceSakeRP-7b-8bpw-exl2)
 
-thx mradermacher for GGUF
+## thx mradermacher for GGUF
 >- [GGUF](https://huggingface.co/mradermacher/IceSakeRP-7b-GGUF)
 >- [i1-GGUF](https://huggingface.co/mradermacher/IceSakeRP-7b-i1-GGUF)
 