---
base_model: wolfram/miquliz-120b-v2.0
language:
- en
- de
- fr
- es
- it
library_name: transformers
license: other
quantized_by: mradermacher
tags:
- mergekit
- merge
---
## About
weighted/imatrix quants of https://huggingface.co/wolfram/miquliz-120b-v2.0
<!-- provided-files -->
static quants are available at https://huggingface.co/mradermacher/miquliz-120b-v2.0-GGUF
## Usage
If you are unsure how to use GGUF files, refer to one of [TheBloke's
READMEs](https://huggingface.co/TheBloke/KafkaLM-70B-German-V0.1-GGUF) for
more details, including how to concatenate multi-part files.
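As a minimal sketch, concatenation in Python might look like the following (assuming the two parts of the i1-Q4_K_S quant have already been downloaded into the current directory; the part names come from the table below, so adjust them for the quant you chose). On Linux/macOS, a plain `cat part-aa part-ab > model.gguf` achieves the same.

```python
import shutil

# Parts must be concatenated in order; names taken from the table below.
parts = [
    "miquliz-120b-v2.0.i1-Q4_K_S.gguf.split-aa",
    "miquliz-120b-v2.0.i1-Q4_K_S.gguf.split-ab",
]

with open("miquliz-120b-v2.0.i1-Q4_K_S.gguf", "wb") as out:
    for part in parts:
        with open(part, "rb") as src:
            # Stream-copy in chunks to avoid loading ~35 GB into RAM.
            shutil.copyfileobj(src, out)
```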
## Provided Quants
(sorted by size, not necessarily quality; IQ quants are often preferable over similar-sized non-IQ quants)
| Link | Type | Size/GB | Notes |
|:-----|:-----|--------:|:------|
| [GGUF](https://huggingface.co/mradermacher/miquliz-120b-v2.0-i1-GGUF/resolve/main/miquliz-120b-v2.0.i1-IQ1_S.gguf) | i1-IQ1_S | 25.7 | for the desperate |
| [GGUF](https://huggingface.co/mradermacher/miquliz-120b-v2.0-i1-GGUF/resolve/main/miquliz-120b-v2.0.i1-IQ1_M.gguf) | i1-IQ1_M | 27.8 | mostly desperate |
| [GGUF](https://huggingface.co/mradermacher/miquliz-120b-v2.0-i1-GGUF/resolve/main/miquliz-120b-v2.0.i1-IQ2_XXS.gguf) | i1-IQ2_XXS | 32.2 | |
| [GGUF](https://huggingface.co/mradermacher/miquliz-120b-v2.0-i1-GGUF/resolve/main/miquliz-120b-v2.0.i1-IQ2_XS.gguf) | i1-IQ2_XS | 35.8 | |
| [GGUF](https://huggingface.co/mradermacher/miquliz-120b-v2.0-i1-GGUF/resolve/main/miquliz-120b-v2.0.i1-IQ2_S.gguf) | i1-IQ2_S | 37.6 | |
| [GGUF](https://huggingface.co/mradermacher/miquliz-120b-v2.0-i1-GGUF/resolve/main/miquliz-120b-v2.0.i1-IQ2_M.gguf) | i1-IQ2_M | 40.9 | |
| [GGUF](https://huggingface.co/mradermacher/miquliz-120b-v2.0-i1-GGUF/resolve/main/miquliz-120b-v2.0.i1-Q2_K.gguf) | i1-Q2_K | 44.6 | IQ3_XXS probably better |
| [GGUF](https://huggingface.co/mradermacher/miquliz-120b-v2.0-i1-GGUF/resolve/main/miquliz-120b-v2.0.i1-IQ3_XXS.gguf) | i1-IQ3_XXS | 47.3 | lower quality |
| [PART 1](https://huggingface.co/mradermacher/miquliz-120b-v2.0-i1-GGUF/resolve/main/miquliz-120b-v2.0.i1-Q3_K_XS.gguf.split-aa) [PART 2](https://huggingface.co/mradermacher/miquliz-120b-v2.0-i1-GGUF/resolve/main/miquliz-120b-v2.0.i1-Q3_K_XS.gguf.split-ab) | i1-Q3_K_XS | 49.3 | |
| [GGUF](https://huggingface.co/mradermacher/miquliz-120b-v2.0-i1-GGUF/resolve/main/miquliz-120b-v2.0.i1-IQ3_XS.gguf) | i1-IQ3_XS | 49.4 | |
| [PART 1](https://huggingface.co/mradermacher/miquliz-120b-v2.0-i1-GGUF/resolve/main/miquliz-120b-v2.0.i1-Q3_K_S.gguf.split-aa) [PART 2](https://huggingface.co/mradermacher/miquliz-120b-v2.0-i1-GGUF/resolve/main/miquliz-120b-v2.0.i1-Q3_K_S.gguf.split-ab) | i1-Q3_K_S | 52.2 | IQ3_XS probably better |
| [PART 1](https://huggingface.co/mradermacher/miquliz-120b-v2.0-i1-GGUF/resolve/main/miquliz-120b-v2.0.i1-IQ3_S.gguf.part1of2) [PART 2](https://huggingface.co/mradermacher/miquliz-120b-v2.0-i1-GGUF/resolve/main/miquliz-120b-v2.0.i1-IQ3_S.gguf.part2of2) | i1-IQ3_S | 52.4 | beats Q3_K* |
| [PART 1](https://huggingface.co/mradermacher/miquliz-120b-v2.0-i1-GGUF/resolve/main/miquliz-120b-v2.0.i1-IQ3_M.gguf.part1of2) [PART 2](https://huggingface.co/mradermacher/miquliz-120b-v2.0-i1-GGUF/resolve/main/miquliz-120b-v2.0.i1-IQ3_M.gguf.part2of2) | i1-IQ3_M | 54.2 | |
| [PART 1](https://huggingface.co/mradermacher/miquliz-120b-v2.0-i1-GGUF/resolve/main/miquliz-120b-v2.0.i1-Q3_K_M.gguf.split-aa) [PART 2](https://huggingface.co/mradermacher/miquliz-120b-v2.0-i1-GGUF/resolve/main/miquliz-120b-v2.0.i1-Q3_K_M.gguf.split-ab) | i1-Q3_K_M | 58.2 | IQ3_S probably better |
| [PART 1](https://huggingface.co/mradermacher/miquliz-120b-v2.0-i1-GGUF/resolve/main/miquliz-120b-v2.0.i1-Q3_K_L.gguf.split-aa) [PART 2](https://huggingface.co/mradermacher/miquliz-120b-v2.0-i1-GGUF/resolve/main/miquliz-120b-v2.0.i1-Q3_K_L.gguf.split-ab) | i1-Q3_K_L | 63.4 | IQ3_M probably better |
| [PART 1](https://huggingface.co/mradermacher/miquliz-120b-v2.0-i1-GGUF/resolve/main/miquliz-120b-v2.0.i1-IQ4_XS.gguf.part1of2) [PART 2](https://huggingface.co/mradermacher/miquliz-120b-v2.0-i1-GGUF/resolve/main/miquliz-120b-v2.0.i1-IQ4_XS.gguf.part2of2) | i1-IQ4_XS | 64.6 | |
| [PART 1](https://huggingface.co/mradermacher/miquliz-120b-v2.0-i1-GGUF/resolve/main/miquliz-120b-v2.0.i1-Q4_0.gguf.part1of2) [PART 2](https://huggingface.co/mradermacher/miquliz-120b-v2.0-i1-GGUF/resolve/main/miquliz-120b-v2.0.i1-Q4_0.gguf.part2of2) | i1-Q4_0 | 68.1 | fast, low quality |
| [PART 1](https://huggingface.co/mradermacher/miquliz-120b-v2.0-i1-GGUF/resolve/main/miquliz-120b-v2.0.i1-Q4_K_S.gguf.split-aa) [PART 2](https://huggingface.co/mradermacher/miquliz-120b-v2.0-i1-GGUF/resolve/main/miquliz-120b-v2.0.i1-Q4_K_S.gguf.split-ab) | i1-Q4_K_S | 68.7 | optimal size/speed/quality |
| [PART 1](https://huggingface.co/mradermacher/miquliz-120b-v2.0-i1-GGUF/resolve/main/miquliz-120b-v2.0.i1-Q4_K_M.gguf.split-aa) [PART 2](https://huggingface.co/mradermacher/miquliz-120b-v2.0-i1-GGUF/resolve/main/miquliz-120b-v2.0.i1-Q4_K_M.gguf.split-ab) | i1-Q4_K_M | 72.6 | fast, recommended |
| [PART 1](https://huggingface.co/mradermacher/miquliz-120b-v2.0-i1-GGUF/resolve/main/miquliz-120b-v2.0.i1-Q5_K_S.gguf.part1of2) [PART 2](https://huggingface.co/mradermacher/miquliz-120b-v2.0-i1-GGUF/resolve/main/miquliz-120b-v2.0.i1-Q5_K_S.gguf.part2of2) | i1-Q5_K_S | 83.2 | |
| [PART 1](https://huggingface.co/mradermacher/miquliz-120b-v2.0-i1-GGUF/resolve/main/miquliz-120b-v2.0.i1-Q5_K_M.gguf.part1of2) [PART 2](https://huggingface.co/mradermacher/miquliz-120b-v2.0-i1-GGUF/resolve/main/miquliz-120b-v2.0.i1-Q5_K_M.gguf.part2of2) | i1-Q5_K_M | 85.4 | |
| [PART 1](https://huggingface.co/mradermacher/miquliz-120b-v2.0-i1-GGUF/resolve/main/miquliz-120b-v2.0.i1-Q6_K.gguf.part1of3) [PART 2](https://huggingface.co/mradermacher/miquliz-120b-v2.0-i1-GGUF/resolve/main/miquliz-120b-v2.0.i1-Q6_K.gguf.part2of3) [PART 3](https://huggingface.co/mradermacher/miquliz-120b-v2.0-i1-GGUF/resolve/main/miquliz-120b-v2.0.i1-Q6_K.gguf.part3of3) | i1-Q6_K | 99.1 | practically like static Q6_K |
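If you prefer scripting the download instead of clicking the links above, here is a minimal sketch using the `huggingface_hub` library. The repo id and filenames are taken from the table above; pick the quant that fits your hardware.

```python
from huggingface_hub import hf_hub_download

repo = "mradermacher/miquliz-120b-v2.0-i1-GGUF"

# Fetch both parts of the i1-Q4_K_S quant; adjust the filenames for other types.
for name in (
    "miquliz-120b-v2.0.i1-Q4_K_S.gguf.split-aa",
    "miquliz-120b-v2.0.i1-Q4_K_S.gguf.split-ab",
):
    path = hf_hub_download(repo_id=repo, filename=name)
    print("downloaded to", path)
```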
Here is a handy graph by ikawrakow comparing some lower-quality quant
types (lower is better):
![image.png](https://www.nethype.de/huggingface_embed/quantpplgraph.png)
And here are Artefact2's thoughts on the matter:
https://gist.github.com/Artefact2/b5f810600771265fc1e39442288e8ec9
## FAQ / Model Request
See https://huggingface.co/mradermacher/model_requests for some answers to
questions you might have and/or if you want some other model quantized.
## Thanks
I thank my company, [nethype GmbH](https://www.nethype.de/), for letting
me use its servers and providing upgrades to my workstation to enable
this work in my free time.
<!-- end -->