---
license: apache-2.0
tags:
  - merge
  - mergekit
  - lazymergekit
  - samir-fama/SamirGPT-v1
  - abacusai/Slerp-CM-mist-dpo
  - EmbeddedLLM/Mistral-7B-Merge-14-v0.2
---

# Marcoro14-7B-dare

Marcoro14-7B-dare is a merge of the following models using mergekit:

* [samir-fama/SamirGPT-v1](https://huggingface.co/samir-fama/SamirGPT-v1)
* [abacusai/Slerp-CM-mist-dpo](https://huggingface.co/abacusai/Slerp-CM-mist-dpo)
* [EmbeddedLLM/Mistral-7B-Merge-14-v0.2](https://huggingface.co/EmbeddedLLM/Mistral-7B-Merge-14-v0.2)

## 🧩 Configuration

```yaml
models:
  - model: mistralai/Mistral-7B-v0.1
    # No parameters necessary for base model
  - model: samir-fama/SamirGPT-v1
    parameters:
      density: 0.53
      weight: 0.4
  - model: abacusai/Slerp-CM-mist-dpo
    parameters:
      density: 0.53
      weight: 0.3
  - model: EmbeddedLLM/Mistral-7B-Merge-14-v0.2
    parameters:
      density: 0.53
      weight: 0.3
merge_method: dare_ties
base_model: mistralai/Mistral-7B-v0.1
parameters:
  int8_mask: true
dtype: bfloat16
```
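For intuition about the `density` values above: in DARE, each merged model's delta weights (its difference from the base model) are randomly dropped with probability `1 - density`, and the survivors are rescaled by `1 / density` so the expected contribution is preserved. A minimal plain-Python sketch of that drop-and-rescale step (`dare_prune` is a hypothetical illustration, not the mergekit API):

```python
import random

def dare_prune(delta, density, seed=0):
    # Hypothetical helper illustrating DARE's drop-and-rescale:
    # keep each delta weight with probability `density`, zero it otherwise,
    # and rescale kept weights by 1/density to preserve the expected sum.
    rng = random.Random(seed)
    return [d / density if rng.random() < density else 0.0 for d in delta]

# Example deltas for one tensor (toy values, not from the real models).
deltas = [0.2, -0.1, 0.05, 0.3, -0.25]
pruned = dare_prune(deltas, density=0.53)
```

The pruned deltas from each model are then combined with the per-model `weight` values and a TIES-style sign-consensus step, and added back onto `mistralai/Mistral-7B-v0.1`.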