---
base_model: wolfram/miquliz-120b-v2.0
language:
- en
- de
- fr
- es
- it
library_name: transformers
license: other
quantized_by: mradermacher
tags:
- mergekit
- merge
---
## About
static quants of https://huggingface.co/wolfram/miquliz-120b-v2.0
While other static and imatrix quants are already available elsewhere, I wanted to offer a wider selection of quant types for this model.
<!-- provided-files -->
weighted/imatrix quants are available at https://huggingface.co/mradermacher/miquliz-120b-v2.0-i1-GGUF
## Usage
If you are unsure how to use GGUF files, refer to one of [TheBloke's
READMEs](https://huggingface.co/TheBloke/KafkaLM-70B-German-V0.1-GGUF) for
more details, including how to concatenate multi-part files.
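As a quick reference, here is a minimal Python sketch for joining split parts back into a single GGUF file (the Q4_K_S file names are taken from the table below; adjust the pattern to whichever quant you downloaded):

```python
import shutil
from pathlib import Path

# Collect the downloaded parts in order; lexical sorting handles both
# the .split-aa/.split-ab and the .part1of2/.part2of2 naming schemes.
parts = sorted(Path(".").glob("miquliz-120b-v2.0.Q4_K_S.gguf.split-*"))

# Stream the parts into one file instead of loading multi-GB chunks into RAM.
with open("miquliz-120b-v2.0.Q4_K_S.gguf", "wb") as out:
    for part in parts:
        with open(part, "rb") as src:
            shutil.copyfileobj(src, out)
```

The equivalent shell one-liner is `cat miquliz-120b-v2.0.Q4_K_S.gguf.split-* > miquliz-120b-v2.0.Q4_K_S.gguf`.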
## Provided Quants
(sorted by size, which is not necessarily an indicator of quality; IQ-quants are often preferable to similar-sized non-IQ quants)
| Link | Type | Size/GB | Notes |
|:-----|:-----|--------:|:------|
| [GGUF](https://huggingface.co/mradermacher/miquliz-120b-v2.0-GGUF/resolve/main/miquliz-120b-v2.0.Q2_K.gguf) | Q2_K | 44.6 | |
| [GGUF](https://huggingface.co/mradermacher/miquliz-120b-v2.0-GGUF/resolve/main/miquliz-120b-v2.0.IQ3_XS.gguf) | IQ3_XS | 49.3 | |
| [PART 1](https://huggingface.co/mradermacher/miquliz-120b-v2.0-GGUF/resolve/main/miquliz-120b-v2.0.Q3_K_XS.gguf.split-aa) [PART 2](https://huggingface.co/mradermacher/miquliz-120b-v2.0-GGUF/resolve/main/miquliz-120b-v2.0.Q3_K_XS.gguf.split-ab) | Q3_K_XS | 49.3 | |
| [PART 1](https://huggingface.co/mradermacher/miquliz-120b-v2.0-GGUF/resolve/main/miquliz-120b-v2.0.IQ3_S.gguf.part1of2) [PART 2](https://huggingface.co/mradermacher/miquliz-120b-v2.0-GGUF/resolve/main/miquliz-120b-v2.0.IQ3_S.gguf.part2of2) | IQ3_S | 52.1 | beats Q3_K* |
| [PART 1](https://huggingface.co/mradermacher/miquliz-120b-v2.0-GGUF/resolve/main/miquliz-120b-v2.0.Q3_K_S.gguf.split-aa) [PART 2](https://huggingface.co/mradermacher/miquliz-120b-v2.0-GGUF/resolve/main/miquliz-120b-v2.0.Q3_K_S.gguf.split-ab) | Q3_K_S | 52.2 | |
| [PART 1](https://huggingface.co/mradermacher/miquliz-120b-v2.0-GGUF/resolve/main/miquliz-120b-v2.0.IQ3_M.gguf.part1of2) [PART 2](https://huggingface.co/mradermacher/miquliz-120b-v2.0-GGUF/resolve/main/miquliz-120b-v2.0.IQ3_M.gguf.part2of2) | IQ3_M | 53.8 | |
| [PART 1](https://huggingface.co/mradermacher/miquliz-120b-v2.0-GGUF/resolve/main/miquliz-120b-v2.0.Q3_K_M.gguf.split-aa) [PART 2](https://huggingface.co/mradermacher/miquliz-120b-v2.0-GGUF/resolve/main/miquliz-120b-v2.0.Q3_K_M.gguf.split-ab) | Q3_K_M | 58.2 | lower quality |
| [PART 1](https://huggingface.co/mradermacher/miquliz-120b-v2.0-GGUF/resolve/main/miquliz-120b-v2.0.Q3_K_L.gguf.split-aa) [PART 2](https://huggingface.co/mradermacher/miquliz-120b-v2.0-GGUF/resolve/main/miquliz-120b-v2.0.Q3_K_L.gguf.split-ab) | Q3_K_L | 63.4 | |
| [PART 1](https://huggingface.co/mradermacher/miquliz-120b-v2.0-GGUF/resolve/main/miquliz-120b-v2.0.IQ4_XS.gguf.part1of2) [PART 2](https://huggingface.co/mradermacher/miquliz-120b-v2.0-GGUF/resolve/main/miquliz-120b-v2.0.IQ4_XS.gguf.part2of2) | IQ4_XS | 64.9 | |
| [PART 1](https://huggingface.co/mradermacher/miquliz-120b-v2.0-GGUF/resolve/main/miquliz-120b-v2.0.Q4_K_S.gguf.split-aa) [PART 2](https://huggingface.co/mradermacher/miquliz-120b-v2.0-GGUF/resolve/main/miquliz-120b-v2.0.Q4_K_S.gguf.split-ab) | Q4_K_S | 68.7 | fast, recommended |
| [PART 1](https://huggingface.co/mradermacher/miquliz-120b-v2.0-GGUF/resolve/main/miquliz-120b-v2.0.IQ4_NL.gguf.split-aa) [PART 2](https://huggingface.co/mradermacher/miquliz-120b-v2.0-GGUF/resolve/main/miquliz-120b-v2.0.IQ4_NL.gguf.split-ab) | IQ4_NL | 68.8 | prefer IQ4_XS |
| [PART 1](https://huggingface.co/mradermacher/miquliz-120b-v2.0-GGUF/resolve/main/miquliz-120b-v2.0.Q4_K_M.gguf.split-aa) [PART 2](https://huggingface.co/mradermacher/miquliz-120b-v2.0-GGUF/resolve/main/miquliz-120b-v2.0.Q4_K_M.gguf.split-ab) | Q4_K_M | 72.6 | fast, recommended |
| [PART 1](https://huggingface.co/mradermacher/miquliz-120b-v2.0-GGUF/resolve/main/miquliz-120b-v2.0.Q5_K_S.gguf.split-aa) [PART 2](https://huggingface.co/mradermacher/miquliz-120b-v2.0-GGUF/resolve/main/miquliz-120b-v2.0.Q5_K_S.gguf.split-ab) | Q5_K_S | 83.2 | |
| [PART 1](https://huggingface.co/mradermacher/miquliz-120b-v2.0-GGUF/resolve/main/miquliz-120b-v2.0.Q5_K_M.gguf.split-aa) [PART 2](https://huggingface.co/mradermacher/miquliz-120b-v2.0-GGUF/resolve/main/miquliz-120b-v2.0.Q5_K_M.gguf.split-ab) | Q5_K_M | 85.4 | |
| [PART 1](https://huggingface.co/mradermacher/miquliz-120b-v2.0-GGUF/resolve/main/miquliz-120b-v2.0.Q6_K.gguf.split-aa) [PART 2](https://huggingface.co/mradermacher/miquliz-120b-v2.0-GGUF/resolve/main/miquliz-120b-v2.0.Q6_K.gguf.split-ab) [PART 3](https://huggingface.co/mradermacher/miquliz-120b-v2.0-GGUF/resolve/main/miquliz-120b-v2.0.Q6_K.gguf.split-ac) | Q6_K | 99.1 | very good quality |
| [PART 1](https://huggingface.co/mradermacher/miquliz-120b-v2.0-GGUF/resolve/main/miquliz-120b-v2.0.Q8_0.gguf.split-aa) [PART 2](https://huggingface.co/mradermacher/miquliz-120b-v2.0-GGUF/resolve/main/miquliz-120b-v2.0.Q8_0.gguf.split-ab) [PART 3](https://huggingface.co/mradermacher/miquliz-120b-v2.0-GGUF/resolve/main/miquliz-120b-v2.0.Q8_0.gguf.split-ac) | Q8_0 | 128.2 | fast, best quality |
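Once you have a (reassembled) GGUF file, it loads like any other GGUF model. Below is a minimal sketch using the third-party llama-cpp-python bindings (`pip install llama-cpp-python`); the file name and parameter values are illustrative, not requirements:

```python
from llama_cpp import Llama

# Load the quantized model; n_gpu_layers=-1 offloads all layers to the GPU,
# assuming llama.cpp was built with GPU support and enough VRAM is available.
llm = Llama(
    model_path="miquliz-120b-v2.0.Q4_K_S.gguf",
    n_ctx=4096,       # context window to allocate
    n_gpu_layers=-1,
)

out = llm("Explain GGUF quantization in one sentence.", max_tokens=64)
print(out["choices"][0]["text"])
```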
Here is a handy graph by ikawrakow comparing some lower-quality quant
types (lower is better):
![image.png](https://www.nethype.de/huggingface_embed/quantpplgraph.png)
And here are Artefact2's thoughts on the matter:
https://gist.github.com/Artefact2/b5f810600771265fc1e39442288e8ec9
## FAQ / Model Request
See https://huggingface.co/mradermacher/model_requests for answers to
questions you might have and for requesting other models to be quantized.
## Thanks
I thank my company, [nethype GmbH](https://www.nethype.de/), for letting
me use its servers and providing upgrades to my workstation to enable
this work in my free time.
<!-- end -->