legraphista committed on
Upload README.md with huggingface_hub
README.md CHANGED
@@ -72,7 +72,7 @@ Link: [here](https://huggingface.co/legraphista/internlm2-math-plus-mixtral8x22b
 | [internlm2-math-plus-mixtral8x22b.Q2_K_S/*](https://huggingface.co/legraphista/internlm2-math-plus-mixtral8x22b-IMat-GGUF/tree/main/internlm2-math-plus-mixtral8x22b.Q2_K_S) | Q2_K_S | 48.09GB | ✅ Available | 🟢 IMatrix | ✅ Yes |
 | [internlm2-math-plus-mixtral8x22b.IQ4_NL/*](https://huggingface.co/legraphista/internlm2-math-plus-mixtral8x22b-IMat-GGUF/tree/main/internlm2-math-plus-mixtral8x22b.IQ4_NL) | IQ4_NL | 79.78GB | ✅ Available | 🟢 IMatrix | ✅ Yes |
 | [internlm2-math-plus-mixtral8x22b.IQ4_XS/*](https://huggingface.co/legraphista/internlm2-math-plus-mixtral8x22b-IMat-GGUF/tree/main/internlm2-math-plus-mixtral8x22b.IQ4_XS) | IQ4_XS | 75.48GB | ✅ Available | 🟢 IMatrix | ✅ Yes |
-| internlm2-math-plus-mixtral8x22b.IQ3_M | IQ3_M | - | ⏳ Processing | 🟢 IMatrix | - |
+| [internlm2-math-plus-mixtral8x22b.IQ3_M/*](https://huggingface.co/legraphista/internlm2-math-plus-mixtral8x22b-IMat-GGUF/tree/main/internlm2-math-plus-mixtral8x22b.IQ3_M) | IQ3_M | 64.50GB | ✅ Available | 🟢 IMatrix | ✅ Yes |
 | internlm2-math-plus-mixtral8x22b.IQ3_S | IQ3_S | - | ⏳ Processing | 🟢 IMatrix | - |
 | internlm2-math-plus-mixtral8x22b.IQ3_XS | IQ3_XS | - | ⏳ Processing | 🟢 IMatrix | - |
 | internlm2-math-plus-mixtral8x22b.IQ3_XXS | IQ3_XXS | - | ⏳ Processing | 🟢 IMatrix | - |
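The IQ3_M quant added in this commit is marked ✅ Available and ✅ Yes under the split column (hence the `/*` suffix on its folder). As a minimal sketch of fetching just that folder with `huggingface_hub` (the library named in the commit message) — the repo id and file pattern are taken from the table above, while the local directory name is an arbitrary assumption:

```python
# Sketch: download only the IQ3_M split shards from the repo listed in the table.
# repo_id and allow_patterns come from the table above; local_dir is hypothetical.
from huggingface_hub import snapshot_download

snapshot_download(
    repo_id="legraphista/internlm2-math-plus-mixtral8x22b-IMat-GGUF",
    allow_patterns=["internlm2-math-plus-mixtral8x22b.IQ3_M/*"],
    local_dir="internlm2-math-plus-mixtral8x22b-IMat-GGUF",  # assumed target directory
)
```

Recent llama.cpp builds can typically load a split GGUF by pointing at its first shard, so merging the downloaded files is usually not required.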