Transformers
GGUF
English
Inference Endpoints
imatrix
conversational
mradermacher committed (verified)
Commit d295f89 · Parent(s): 01a2b70

auto-patch README.md

Files changed (1)
README.md +22 -0
README.md CHANGED
@@ -34,9 +34,31 @@ more details, including on how to concatenate multi-part files.
| Link | Type | Size/GB | Notes |
|:-----|:-----|--------:|:------|
+ | [GGUF](https://huggingface.co/mradermacher/tulu-2-dpo-7b-i1-GGUF/resolve/main/tulu-2-dpo-7b.i1-IQ1_S.gguf) | i1-IQ1_S | 1.6 | for the desperate |
+ | [GGUF](https://huggingface.co/mradermacher/tulu-2-dpo-7b-i1-GGUF/resolve/main/tulu-2-dpo-7b.i1-IQ1_M.gguf) | i1-IQ1_M | 1.8 | mostly desperate |
+ | [GGUF](https://huggingface.co/mradermacher/tulu-2-dpo-7b-i1-GGUF/resolve/main/tulu-2-dpo-7b.i1-IQ2_XXS.gguf) | i1-IQ2_XXS | 2.0 | |
+ | [GGUF](https://huggingface.co/mradermacher/tulu-2-dpo-7b-i1-GGUF/resolve/main/tulu-2-dpo-7b.i1-IQ2_XS.gguf) | i1-IQ2_XS | 2.1 | |
+ | [GGUF](https://huggingface.co/mradermacher/tulu-2-dpo-7b-i1-GGUF/resolve/main/tulu-2-dpo-7b.i1-IQ2_S.gguf) | i1-IQ2_S | 2.3 | |
+ | [GGUF](https://huggingface.co/mradermacher/tulu-2-dpo-7b-i1-GGUF/resolve/main/tulu-2-dpo-7b.i1-Q2_K_S.gguf) | i1-Q2_K_S | 2.4 | very low quality |
+ | [GGUF](https://huggingface.co/mradermacher/tulu-2-dpo-7b-i1-GGUF/resolve/main/tulu-2-dpo-7b.i1-IQ2_M.gguf) | i1-IQ2_M | 2.5 | |
| [GGUF](https://huggingface.co/mradermacher/tulu-2-dpo-7b-i1-GGUF/resolve/main/tulu-2-dpo-7b.i1-Q2_K.gguf) | i1-Q2_K | 2.6 | IQ3_XXS probably better |
+ | [GGUF](https://huggingface.co/mradermacher/tulu-2-dpo-7b-i1-GGUF/resolve/main/tulu-2-dpo-7b.i1-IQ3_XXS.gguf) | i1-IQ3_XXS | 2.7 | lower quality |
+ | [GGUF](https://huggingface.co/mradermacher/tulu-2-dpo-7b-i1-GGUF/resolve/main/tulu-2-dpo-7b.i1-IQ3_XS.gguf) | i1-IQ3_XS | 2.9 | |
+ | [GGUF](https://huggingface.co/mradermacher/tulu-2-dpo-7b-i1-GGUF/resolve/main/tulu-2-dpo-7b.i1-IQ3_S.gguf) | i1-IQ3_S | 3.0 | beats Q3_K* |
+ | [GGUF](https://huggingface.co/mradermacher/tulu-2-dpo-7b-i1-GGUF/resolve/main/tulu-2-dpo-7b.i1-Q3_K_S.gguf) | i1-Q3_K_S | 3.0 | IQ3_XS probably better |
| [GGUF](https://huggingface.co/mradermacher/tulu-2-dpo-7b-i1-GGUF/resolve/main/tulu-2-dpo-7b.i1-IQ3_M.gguf) | i1-IQ3_M | 3.2 | |
+ | [GGUF](https://huggingface.co/mradermacher/tulu-2-dpo-7b-i1-GGUF/resolve/main/tulu-2-dpo-7b.i1-Q3_K_M.gguf) | i1-Q3_K_M | 3.4 | IQ3_S probably better |
+ | [GGUF](https://huggingface.co/mradermacher/tulu-2-dpo-7b-i1-GGUF/resolve/main/tulu-2-dpo-7b.i1-Q3_K_L.gguf) | i1-Q3_K_L | 3.7 | IQ3_M probably better |
+ | [GGUF](https://huggingface.co/mradermacher/tulu-2-dpo-7b-i1-GGUF/resolve/main/tulu-2-dpo-7b.i1-IQ4_XS.gguf) | i1-IQ4_XS | 3.7 | |
+ | [GGUF](https://huggingface.co/mradermacher/tulu-2-dpo-7b-i1-GGUF/resolve/main/tulu-2-dpo-7b.i1-Q4_0_4_4.gguf) | i1-Q4_0_4_4 | 3.9 | fast on arm, low quality |
+ | [GGUF](https://huggingface.co/mradermacher/tulu-2-dpo-7b-i1-GGUF/resolve/main/tulu-2-dpo-7b.i1-Q4_0_4_8.gguf) | i1-Q4_0_4_8 | 3.9 | fast on arm+i8mm, low quality |
+ | [GGUF](https://huggingface.co/mradermacher/tulu-2-dpo-7b-i1-GGUF/resolve/main/tulu-2-dpo-7b.i1-Q4_0_8_8.gguf) | i1-Q4_0_8_8 | 3.9 | fast on arm+sve, low quality |
+ | [GGUF](https://huggingface.co/mradermacher/tulu-2-dpo-7b-i1-GGUF/resolve/main/tulu-2-dpo-7b.i1-Q4_0.gguf) | i1-Q4_0 | 3.9 | fast, low quality |
| [GGUF](https://huggingface.co/mradermacher/tulu-2-dpo-7b-i1-GGUF/resolve/main/tulu-2-dpo-7b.i1-Q4_K_S.gguf) | i1-Q4_K_S | 4.0 | optimal size/speed/quality |
+ | [GGUF](https://huggingface.co/mradermacher/tulu-2-dpo-7b-i1-GGUF/resolve/main/tulu-2-dpo-7b.i1-Q4_K_M.gguf) | i1-Q4_K_M | 4.2 | fast, recommended |
+ | [GGUF](https://huggingface.co/mradermacher/tulu-2-dpo-7b-i1-GGUF/resolve/main/tulu-2-dpo-7b.i1-Q5_K_S.gguf) | i1-Q5_K_S | 4.8 | |
+ | [GGUF](https://huggingface.co/mradermacher/tulu-2-dpo-7b-i1-GGUF/resolve/main/tulu-2-dpo-7b.i1-Q5_K_M.gguf) | i1-Q5_K_M | 4.9 | |
+ | [GGUF](https://huggingface.co/mradermacher/tulu-2-dpo-7b-i1-GGUF/resolve/main/tulu-2-dpo-7b.i1-Q6_K.gguf) | i1-Q6_K | 5.6 | practically like static Q6_K |

Here is a handy graph by ikawrakow comparing some lower-quality quant
types (lower is better):
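
For reference, a minimal sketch of how one of the files listed in this commit could be fetched and run locally. This is not part of the commit itself: the choice of the i1-Q4_K_S quant, the `llama-cpp-python` loader, and the `n_ctx` value are illustrative assumptions; `hf_hub_download` is the standard `huggingface_hub` download API.

```python
# Illustrative sketch, not part of this repo's tooling.
# Assumes: pip install huggingface_hub llama-cpp-python
from huggingface_hub import hf_hub_download

# Download one quant from this repo; i1-Q4_K_S is used here because the
# table above flags it as "optimal size/speed/quality" (an example choice).
model_path = hf_hub_download(
    repo_id="mradermacher/tulu-2-dpo-7b-i1-GGUF",
    filename="tulu-2-dpo-7b.i1-Q4_K_S.gguf",
)

from llama_cpp import Llama

# n_ctx=2048 is an assumed context size, not something this commit specifies.
llm = Llama(model_path=model_path, n_ctx=2048)
out = llm("Explain GGUF in one sentence.", max_tokens=64)
print(out["choices"][0]["text"])
```

Any other row in the table can be substituted by changing `filename`; the single-file quants here need no concatenation step.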