legraphista
committed on
Upload README.md with huggingface_hub
README.md CHANGED
@@ -76,7 +76,7 @@ Link: [here](https://huggingface.co/legraphista/internlm2-math-plus-mixtral8x22b
  | [internlm2-math-plus-mixtral8x22b.IQ3_S/*](https://huggingface.co/legraphista/internlm2-math-plus-mixtral8x22b-IMat-GGUF/tree/main/internlm2-math-plus-mixtral8x22b.IQ3_S) | IQ3_S | 61.50GB | ✅ Available | 🟢 IMatrix | ✂ Yes |
  | [internlm2-math-plus-mixtral8x22b.IQ3_XS/*](https://huggingface.co/legraphista/internlm2-math-plus-mixtral8x22b-IMat-GGUF/tree/main/internlm2-math-plus-mixtral8x22b.IQ3_XS) | IQ3_XS | 58.23GB | ✅ Available | 🟢 IMatrix | ✂ Yes |
  | [internlm2-math-plus-mixtral8x22b.IQ3_XXS/*](https://huggingface.co/legraphista/internlm2-math-plus-mixtral8x22b-IMat-GGUF/tree/main/internlm2-math-plus-mixtral8x22b.IQ3_XXS) | IQ3_XXS | 54.91GB | ✅ Available | 🟢 IMatrix | ✂ Yes |
- | internlm2-math-plus-mixtral8x22b.IQ2_M | IQ2_M | - | ⏳ Processing | 🟢 IMatrix | - |
+ | [internlm2-math-plus-mixtral8x22b.IQ2_M/*](https://huggingface.co/legraphista/internlm2-math-plus-mixtral8x22b-IMat-GGUF/tree/main/internlm2-math-plus-mixtral8x22b.IQ2_M) | IQ2_M | 46.71GB | ✅ Available | 🟢 IMatrix | ✂ Yes |
  | internlm2-math-plus-mixtral8x22b.IQ2_S | IQ2_S | - | ⏳ Processing | 🟢 IMatrix | - |
  | internlm2-math-plus-mixtral8x22b.IQ2_XS | IQ2_XS | - | ⏳ Processing | 🟢 IMatrix | - |
  | internlm2-math-plus-mixtral8x22b.IQ2_XXS | IQ2_XXS | - | ⏳ Processing | 🟢 IMatrix | - |
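The quants marked as split (e.g. the newly available IQ2_M at 46.71GB) are stored as multi-part GGUF files inside per-quant folders, so the whole folder has to be fetched. A minimal sketch of doing that with `huggingface_hub` (the library named in the commit message) is shown below; the `local_dir` path is a placeholder, and the repo id and folder name are taken from the table above.

```python
# Sketch: download only the split IQ2_M quant from this repo.
# Assumes `pip install huggingface_hub`; local_dir is a placeholder path.
from huggingface_hub import snapshot_download

snapshot_download(
    repo_id="legraphista/internlm2-math-plus-mixtral8x22b-IMat-GGUF",
    # Restrict the download to the IQ2_M folder listed in the table.
    allow_patterns=["internlm2-math-plus-mixtral8x22b.IQ2_M/*"],
    local_dir="./internlm2-math-plus-mixtral8x22b-IMat-GGUF",
)
```

Once all shards of a split quant are in the same directory, llama.cpp can generally be pointed at the first shard and will pick up the remaining parts automatically.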