Update README.md
README.md
CHANGED
@@ -11,7 +11,7 @@ license: apache-2.0
 | [Taiwan-inquiry_7B_v2.1-Q5_K_M.gguf](https://huggingface.co/ChenWeiLi/Taiwan-inquiry_7B_v2.1.gguf/blob/main/Taiwan-inquiry_7B_v2.1-Q5_K_M.gguf) | Q5_K_M | 5 | 5.32 GB | large, very low quality loss - recommended |
 | [Taiwan-inquiry_7B_v2.1-Q6_K.gguf](https://huggingface.co/ChenWeiLi/Taiwan-inquiry_7B_v2.1.gguf/blob/main/Taiwan-inquiry_7B_v2.1-Q6_K.gguf) | Q6_K | 6 | 6.14 GB | very large, extremely low quality loss |
 | [Taiwan-inquiry_7B_v2.1-Q8_0.gguf](https://huggingface.co/ChenWeiLi/Taiwan-inquiry_7B_v2.1.gguf/blob/main/Taiwan-inquiry_7B_v2.1-Q8_0.gguf) | Q8_0 | 8 | 7.96 GB | very large, extremely low quality loss - not recommended |
-| [Taiwan-inquiry_7B_v2.1.gguf ](https://huggingface.co/ChenWeiLi/Taiwan-inquiry_7B_v2.1.gguf/blob/main/
+| [Taiwan-inquiry_7B_v2.1.gguf](https://huggingface.co/ChenWeiLi/Taiwan-inquiry_7B_v2.1.gguf/blob/main/Taiwan-inquiry_7B_v2.1.gguf) | No quantization | 16 or 32 | 15 GB | very large, no quality loss - not recommended |

 ## Reference
 - [llama.cpp](https://github.com/ggerganov/llama.cpp)
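
As a rough usage sketch (not part of the commit): the snippet below loads the Q5_K_M file from the table above with llama-cpp-python, the common Python bindings for the llama.cpp project referenced in the README. The local file path, context size, and prompt are placeholders chosen for illustration.

```python
# Minimal sketch, assuming llama-cpp-python is installed and the
# Q5_K_M GGUF from the table has already been downloaded locally.
from llama_cpp import Llama

# Placeholder path: point this at wherever the GGUF file was saved.
llm = Llama(
    model_path="./Taiwan-inquiry_7B_v2.1-Q5_K_M.gguf",
    n_ctx=2048,  # context window; adjust to available RAM
)

# Plain completion call; returns an OpenAI-style response dict.
out = llm("請簡短介紹你自己。", max_tokens=128)
print(out["choices"][0]["text"])
```

The same file also works directly with the llama.cpp command-line tools; the Python route is shown here only because it keeps the example self-contained.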