---
library_name: transformers
tags:
- mergekit
- arcee-ai
datasets:
- arcee-ai/sec-data-mini
base_model: arcee-ai/Mistral-7B-Instruct-v0.2-sliced-24-layer
model_creator: arcee-ai
model_name: Mistral-7B-Instruct-v0.2-sliced-24-layer
quantized_by: JanHQ
---
# Model Description
This is a GGUF version of [arcee-ai/Mistral-7B-Instruct-v0.2-sliced-24-layer](https://huggingface.co/arcee-ai/Mistral-7B-Instruct-v0.2-sliced-24-layer).
- Model creator: [arcee-ai](https://huggingface.co/arcee-ai)
- Original model: [Mistral-7B-Instruct-v0.2-sliced-24-layer](https://huggingface.co/arcee-ai/Mistral-7B-Instruct-v0.2-sliced-24-layer)
- Model description: [Readme](https://huggingface.co/arcee-ai/Mistral-7B-Instruct-v0.2-sliced-24-layer/blob/main/README.md)
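
Below is a minimal sketch of downloading and running the GGUF file with `llama-cpp-python`. The repository id and the exact GGUF filename are assumptions; check this repository's "Files and versions" tab for the actual quantized file name.

```python
# pip install huggingface_hub llama-cpp-python
from huggingface_hub import hf_hub_download
from llama_cpp import Llama

# NOTE: repo_id and filename are assumptions for illustration only;
# replace them with the actual values listed in this repository.
model_path = hf_hub_download(
    repo_id="janhq/Mistral-7B-Instruct-v0.2-sliced-24-layer-GGUF",
    filename="mistral-7b-instruct-v0.2-sliced-24-layer.Q4_K_M.gguf",
)

# Load the quantized model locally
llm = Llama(model_path=model_path, n_ctx=2048)

# Mistral-Instruct chat format wraps the user turn in [INST] ... [/INST]
output = llm("[INST] Summarize what GGUF quantization is. [/INST]", max_tokens=128)
print(output["choices"][0]["text"])
```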
# About Jan
Jan believes in the need for an open-source AI ecosystem and is building the infrastructure and tooling to allow open-source AI models to compete on a level playing field with proprietary ones.
Jan's long-term vision is to build a cognitive framework for future robots that serve as practical, useful assistants for humans and businesses in everyday life.
# Jan Model Converter
This is the repository for the [open-source converter](https://github.com/janhq/model-converter). We would be grateful if the community could contribute to and strengthen this repository. We aim to expand it to support conversion into a variety of formats.