---
library_name: transformers
license: apache-2.0
base_model:
  - mistralai/Mistral-Nemo-Base-2407
---


# Zinakha-12b 🧙‍♂️

Zinakha-12b aims to be the ideal companion for chats involving multiple roles. It understands context well and excels at creativity and storytelling. It is built on Mistral Nemo 12B, trained on several datasets, and combined with some layer merges to enhance its capabilities.

## Model Details 📊

### Quantization

Model Architecture πŸ—οΈ

- **Base model:** mistralai/Mistral-Nemo-Base-2407
- **Parameter count:** ~12 billion
- **Architecture:** Transformer-based language model
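
Since the card lists `transformers` as the library, here is a minimal loading sketch. The repo id `abhiAI777/Zinakha-12b` and the generation settings are assumptions for illustration, not values stated on this card:

```python
# Minimal usage sketch; the repo id below is assumed from the card's author and model name.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "abhiAI777/Zinakha-12b"  # hypothetical repo id, adjust if needed

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,   # ~12B parameters, so half precision on a GPU is advisable
    device_map="auto",
)

prompt = "Write the opening scene of a story about a wandering wizard."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

# Sampling settings are illustrative; tune them for your use case.
outputs = model.generate(**inputs, max_new_tokens=256, do_sample=True, temperature=0.8)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```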