---
language:
- en
- fr
- de
- es
- it
- pt
- zh
- ja
- ru
- ko
license: apache-2.0
base_model:
- mistralai/Mistral-Small-24B-Instruct-2501
base_model_relation: quantized
library_name: mlc-llm
pipeline_tag: text-generation
---

4-bit [OmniQuant](https://arxiv.org/abs/2308.13137) quantized version of [mistralai/Mistral-Small-24B-Instruct-2501](https://huggingface.co/mistralai/Mistral-Small-24B-Instruct-2501) for inference with the [Private LLM](https://privatellm.app) app.
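
This package is intended for the Private LLM app, which performs inference on-device without any code on the user's side. For reference only, the snippet below is a minimal sketch of how an MLC-formatted model is typically served with the `mlc_llm` Python API (matching the `library_name` above). The repo id is a placeholder, and whether this particular OmniQuant package loads outside the app is an assumption, not something this card states.

```python
from mlc_llm import MLCEngine

# Placeholder repo id -- substitute the actual path of an MLC-compatible package.
model = "HF://<repo-id-of-an-mlc-model>"

# Create the engine; mlc_llm resolves the model and loads the compiled weights.
engine = MLCEngine(model)

# OpenAI-compatible chat completion, streamed token by token.
for response in engine.chat.completions.create(
    messages=[{"role": "user", "content": "Summarize OmniQuant in one sentence."}],
    model=model,
    stream=True,
):
    for choice in response.choices:
        print(choice.delta.content, end="", flush=True)

print()
engine.terminate()
```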