Contributors: Nicolas Mejia Petit

License

Mistral 29b: A New Base Model

The objective of this model is to serve as a new, fully open-source base model with 29.2 billion parameters.

This is a raw base model: out of the box it emits unstructured jargon, and it needs to be fine-tuned, either with QLoRA (with the adapter attached to every layer) or, better yet, a full fine-tune.

Model Creation

The model was created by stacking four models — Dolphin, Zephyr, MetaMath 7B, and Speechless Code — to form a single model.
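The card does not name the tool used for the merge; a four-model layer stack of this kind is commonly expressed as a "passthrough" merge, for example with mergekit. A hypothetical config (model IDs and layer ranges are placeholders, not the actual recipe) might look like:

```yaml
# Hypothetical mergekit passthrough config: concatenates layer
# slices from four source models into one deeper model.
slices:
  - sources:
      - model: dolphin-7b          # placeholder ID
        layer_range: [0, 32]
  - sources:
      - model: zephyr-7b           # placeholder ID
        layer_range: [0, 32]
  - sources:
      - model: metamath-7b         # placeholder ID
        layer_range: [0, 32]
  - sources:
      - model: speechless-code-7b  # placeholder ID
        layer_range: [0, 32]
merge_method: passthrough
dtype: bfloat16
```

A passthrough merge copies weights without averaging them, which is why the resulting parameter count is roughly the sum of the stacked slices.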

Useful Resources

Source Models

Model size: 28.2B params (Safetensors) · Tensor type: BF16