---
base_model:
- migtissera/Tess-3-Mistral-Large-2-123B
- TheDrummer/Behemoth-123B-v1
- SillyTilly/Mistral-Large-Instruct-2407
- NeverSleep/Lumimaid-v0.2-123B
- anthracite-org/magnum-v2-123b
library_name: transformers
tags:
- mergekit
- merge
---
# ML-MS-Etheris-123B
![](https://cdn-uploads.huggingface.co/production/uploads/64545af5ec40bbbd01242ca6/ieEjL3TxpDM3WAZQcya6E.png)
Now the cute anime girl has your attention.
Creator: SteelSkull

## About Etheris-123B:

Name Legend:
- ML = Mistral-Large
- MS = Model Stock
- 123B = it's a 123B-parameter model
This model merges the robust storytelling of multiple models while attempting to maintain intelligence. The final model was produced by merging the Model Stock (model soup) result with DELLA to add some special sauce.

Use the Mistral, ChatML, or Meth format.
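For reference, a minimal sketch of two of the supported prompt templates, built by hand (illustrative only; the authoritative templates are the chat templates bundled with each tokenizer, and the special tokens shown here are assumptions based on the standard Mistral and ChatML conventions):

```python
def mistral_prompt(user: str) -> str:
    # Mistral instruct convention: the user turn is wrapped in [INST] tags
    return f"<s>[INST] {user} [/INST]"

def chatml_prompt(user: str, system: str = "You are a helpful assistant.") -> str:
    # ChatML convention: each turn is a <|im_start|>role ... <|im_end|> block,
    # ending with an open assistant block for the model to complete
    return (
        f"<|im_start|>system\n{system}<|im_end|>\n"
        f"<|im_start|>user\n{user}<|im_end|>\n"
        f"<|im_start|>assistant\n"
    )

print(mistral_prompt("Hello"))
print(chatml_prompt("Hello"))
```

In practice, prefer `tokenizer.apply_chat_template(...)` from `transformers` so the template always matches the model's own tokenizer config.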
## Quants:
## Config:

```python
MODEL_NAME = "ML-MS-Etheris-123B"
yaml_config = """
base_model: SillyTilly/Mistral-Large-Instruct-2407
merge_method: model_stock
dtype: bfloat16
models:
  - model: NeverSleep/Lumimaid-v0.2-123B
  - model: TheDrummer/Behemoth-123B-v1
  - model: migtissera/Tess-3-Mistral-Large-2-123B
  - model: anthracite-org/magnum-v2-123b
---
base_model: SillyTilly/Mistral-Large-Instruct-2407
merge_method: della
dtype: bfloat16
models:
  - model: ./merge/msbase/Etheris-123B
model: ./merge/della/attempt3/model
"""
```
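A minimal sketch of reproducing the first pass with mergekit, assuming each YAML document above is saved to its own file (the file name `model_stock.yml` is illustrative; the output path comes from the config above):

```shell
pip install mergekit
# first pass: Model Stock soup of the four RP/storytelling models
mergekit-yaml model_stock.yml ./merge/msbase/Etheris-123B --cuda
```

The second (DELLA) pass would then point its `models:` entry at the Model Stock output directory, as shown in the config.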
If you wish to support: