---
base_model:
  - IntervitensInc/Mistral-Nemo-Base-2407-chatml
library_name: transformers
tags:
  - mergekit
  - merge
---

# magmell-r1g

This is a merge of pre-trained language models created using mergekit.

## Merge Details

### Merge Method

This model was merged using the DARE TIES merge method, with IntervitensInc/Mistral-Nemo-Base-2407-chatml as the base.
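
For intuition, the sketch below illustrates roughly what a DARE TIES merge does to a single weight tensor. It is not mergekit's implementation; the `density` and `weight` names simply mirror the fields in the configuration further down.

```python
# Illustrative NumPy sketch of a DARE TIES merge for ONE weight tensor.
# Not mergekit's code; `density` and `weight` mirror the YAML config below.
import numpy as np

def dare_ties_merge(base, finetuned, densities, weights, seed=0):
    """base: ndarray; finetuned, densities, weights: one entry per merged model."""
    rng = np.random.default_rng(seed)
    deltas = []
    for ft, density, weight in zip(finetuned, densities, weights):
        delta = ft - base                             # task vector vs. the base
        keep = rng.random(delta.shape) < density      # DARE: randomly drop (1 - density)
        delta = np.where(keep, delta / density, 0.0)  # rescale the survivors
        deltas.append(weight * delta)                 # per-model weight from the config
    deltas = np.stack(deltas)
    elected = np.sign(deltas.sum(axis=0))             # TIES: majority sign per parameter
    agree = np.sign(deltas) == elected                # drop deltas that fight the majority
    return base + np.where(agree, deltas, 0.0).sum(axis=0)
```

The real implementation handles further details (e.g. optional weight normalization), so treat this purely as a mental model.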

### Models Merged

The following models were included in the merge:

* /workspace/models/magmell-r1/monk
* /workspace/models/magmell-r1/hero
* /workspace/models/magmell-r1/deity

### Configuration

The following YAML configuration was used to produce this model:

```yaml
base_model: IntervitensInc/Mistral-Nemo-Base-2407-chatml
merge_method: dare_ties
slices:
- sources:
  - layer_range: [0, 40]
    model: /workspace/models/magmell-r1/monk
    parameters:
      density: 0.7
      weight: 0.5
  - layer_range: [0, 40]
    model: /workspace/models/magmell-r1/hero
    parameters:
      density: 0.9
      weight: 1.0
  - layer_range: [0, 40]
    model: /workspace/models/magmell-r1/deity
    parameters:
      density: 0.5
      weight: 0.7
  - layer_range: [0, 40]
    model: IntervitensInc/Mistral-Nemo-Base-2407-chatml
tokenizer_source: base
```
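
A configuration like the one above is typically run through mergekit's `mergekit-yaml` entry point (e.g. `mergekit-yaml config.yml ./merged`). Once the merged weights are available, a minimal transformers loading sketch looks like the following; the repository id is an assumption based on this repo's name, `device_map="auto"` requires accelerate, and the chat-template call presumes the ChatML tokenizer inherited from the base.

```python
# Minimal usage sketch with transformers.
# The repo id below is assumed from this repository's name; adjust as needed,
# or point it at a local merge output directory.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "inflatebot/MN-12B-Mag-Mell-R1"  # assumption
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype="auto", device_map="auto")

# tokenizer_source is the base, and the base is a ChatML conversion, so the
# chat template should produce ChatML-formatted prompts.
messages = [{"role": "user", "content": "Write a two-line poem about merging models."}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)
output = model.generate(input_ids, max_new_tokens=128)
print(tokenizer.decode(output[0][input_ids.shape[-1]:], skip_special_tokens=True))
```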