Mention the type of quant — very minor, but it might save someone a click.
README.md (CHANGED)
@@ -95,7 +95,7 @@ Well, yes, but actually no. You may see the names of benchmarks in the datasets
 
 ## Quants and Other Formats
 
 - GGUFs:
-  * Official: [https://huggingface.co/darkcloudai/huskylm-2.5-8b-GGUF](https://huggingface.co/darkcloudai/huskylm-2.5-8b-GGUF)
+  * Official (static): [https://huggingface.co/darkcloudai/huskylm-2.5-8b-GGUF](https://huggingface.co/darkcloudai/huskylm-2.5-8b-GGUF)
   * mradermacher's static quants (thank you!): [https://huggingface.co/mradermacher/huskylm-2.5-8b-GGUF](https://huggingface.co/mradermacher/huskylm-2.5-8b-GGUF)
   * mradermacher's imatrix quants (thank you!): [https://huggingface.co/mradermacher/huskylm-2.5-8b-i1-GGUF](https://huggingface.co/mradermacher/huskylm-2.5-8b-i1-GGUF)
 - Official AWQ (bits: 4, gs: 128, version: gemm): [https://huggingface.co/darkcloudai/huskylm-2.5-8b-AWQ](https://huggingface.co/darkcloudai/huskylm-2.5-8b-AWQ)