Quantization made by Richard Erkhov.
SyntheticMoist-11B-v2 - GGUF
- Model creator: https://huggingface.co/v000000/
- Original model: https://huggingface.co/v000000/SyntheticMoist-11B-v2/
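As a minimal sketch, this is one way to run a GGUF quant of this model with the llama-cpp-python bindings. The quant filename below is hypothetical; substitute whichever file you download from this repo.

```python
# Minimal sketch: running a GGUF quant with llama-cpp-python.
# The filename is hypothetical; use whichever quant file you downloaded.
from llama_cpp import Llama

llm = Llama(
    model_path="SyntheticMoist-11B-v2.Q4_K_M.gguf",  # hypothetical quant file
    n_ctx=4096,       # context window size
    n_gpu_layers=-1,  # offload all layers to the GPU if one is available
)

# The card recommends Alpaca/Vicuna prompting; this uses the Alpaca layout.
prompt = "### Instruction:\nIntroduce yourself in one sentence.\n\n### Response:\n"
out = llm(prompt, max_tokens=128, stop=["### Instruction:"])
print(out["choices"][0]["text"])
```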
Original model description:
```yaml
base_model:
  - Sao10K/Fimbulvetr-11B-v2
  - TheDrummer/Moistral-11B-v3
  - Himitsui/MedMitsu-Instruct-11B
  - Himitsui/Kaiju-11B
  - migtissera/Synthia-v3.0-11B
  - jeiku/Re-Host_Limarp_Mistral
library_name: transformers
tags:
  - mergekit
  - merge
  - solar
  - llama
  - not-for-all-audiences
```
SyntheticMoist-v2
RP model based on Solar. Higher density plus LimaRP led to better performance. Use the Alpaca or Vicuna prompt format, shown below.
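For reference, the standard Alpaca prompt layout looks like this ({instruction} is a placeholder for your input):

```
Below is an instruction that describes a task. Write a response that appropriately completes the request.

### Instruction:
{instruction}

### Response:
```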
Thanks to mradermacher for the quants!
Quants
merge
This is a merge of pre-trained language models created using mergekit.
Merge Details
Merge Method
This model was merged using the DARE TIES merge method, with Sao10K/Fimbulvetr-11B-v2 as the base.
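To make the method concrete, here is a conceptual sketch of DARE TIES on flat parameter vectors. It only illustrates the idea (randomly drop and rescale each model's delta, then elect a majority sign per parameter); it is not mergekit's actual implementation, and all names in it are hypothetical.

```python
# Conceptual sketch of DARE TIES on flat parameter vectors; illustrative
# only, not mergekit's implementation. All names here are hypothetical.
import numpy as np

def dare(delta, density, rng):
    # DARE: randomly drop (1 - density) of the delta entries, then rescale
    # the survivors by 1/density so the expected delta is preserved.
    mask = rng.random(delta.shape) < density
    return (delta * mask) / density

def dare_ties(base, finetuned, weights, densities, seed=0):
    rng = np.random.default_rng(seed)
    deltas = [w * dare(ft - base, d, rng)
              for ft, w, d in zip(finetuned, weights, densities)]
    stacked = np.stack(deltas)
    # TIES: elect a majority sign per parameter and discard contributions
    # that disagree with it before summing.
    elected = np.sign(stacked.sum(axis=0))
    merged = np.where(np.sign(stacked) == elected, stacked, 0.0).sum(axis=0)
    return base + merged
```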
Models Merged
The following models were included in the merge:
- TheDrummer/Moistral-11B-v3
- Himitsui/MedMitsu-Instruct-11B
- Himitsui/Kaiju-11B
- migtissera/Synthia-v3.0-11B + jeiku/Re-Host_Limarp_Mistral
Configuration
The following YAML configuration was used to produce this model:
```yaml
models:
  - model: Himitsui/MedMitsu-Instruct-11B
    parameters:
      weight: 0.13
      density: 0.60
  - model: Himitsui/Kaiju-11B
    parameters:
      weight: 0.22
      density: 0.73
  - model: migtissera/Synthia-v3.0-11B+jeiku/Re-Host_Limarp_Mistral
    parameters:
      weight: 0.28
      density: 0.80
  - model: TheDrummer/Moistral-11B-v3
    parameters:
      weight: 0.37
      density: 0.85
merge_method: dare_ties
base_model: Sao10K/Fimbulvetr-11B-v2
parameters:
  int8_mask: true
dtype: bfloat16
```
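As a minimal sketch, the merge could be reproduced by feeding this configuration to mergekit's command-line entry point; this assumes mergekit is installed (`pip install mergekit`) and the YAML above is saved as config.yml, and the output path is hypothetical.

```python
# Minimal sketch: invoking mergekit's CLI on the configuration above.
# Assumes `pip install mergekit`; the output path is hypothetical.
import subprocess

subprocess.run(
    ["mergekit-yaml", "config.yml", "./SyntheticMoist-11B-v2"],
    check=True,
)
```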