---
base_model: []
library_name: transformers
tags:
  - mergekit
  - merge
---

# output-model-directory

This is a merge of pre-trained language models created using mergekit.
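
In practice, a merge like this is produced by pointing the `mergekit-yaml` CLI at the configuration listed below; a minimal sketch (the config filename is illustrative, and available flags vary by mergekit version):

```sh
pip install mergekit
mergekit-yaml ./slerp-config.yaml ./output-model-directory --cuda
```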

## Merge Details

### Merge Method

This model was merged using the SLERP merge method.
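
Conceptually, for an interpolation weight `t`, SLERP blends two weight tensors along the arc between them rather than along a straight line, which preserves the tensors' geometry better than plain linear averaging. Below is a minimal NumPy sketch of the idea, not mergekit's actual implementation (which adds dtype handling and other edge cases):

```python
import numpy as np

def slerp(t: float, v0: np.ndarray, v1: np.ndarray, eps: float = 1e-8) -> np.ndarray:
    # Measure the angle between the tensors on flattened, unit-norm copies.
    u0 = v0.ravel() / (np.linalg.norm(v0) + eps)
    u1 = v1.ravel() / (np.linalg.norm(v1) + eps)
    omega = np.arccos(np.clip(np.dot(u0, u1), -1.0, 1.0))
    if np.sin(omega) < eps:
        # Nearly colinear tensors: fall back to plain linear interpolation.
        return (1.0 - t) * v0 + t * v1
    # Blend along the arc: t=0 returns v0, t=1 returns v1.
    return (np.sin((1.0 - t) * omega) * v0 + np.sin(t * omega) * v1) / np.sin(omega)

# Toy stand-ins for two models' weight tensors.
a, b = np.random.randn(1024, 1024), np.random.randn(1024, 1024)
merged = slerp(0.5, a, b)  # halfway between the two tensors
assert np.allclose(slerp(0.0, a, b), a) and np.allclose(slerp(1.0, a, b), b)
```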

### Models Merged

The following models were included in the merge:

- ./models/Moistral-11B-v2
- ./models/RoleBeagle-11B

### Configuration

The following YAML configuration was used to produce this model:

```yaml
models:
  - model: ./models/Moistral-11B-v2
  - model: ./models/RoleBeagle-11B
merge_method: slerp
base_model: ./models/Moistral-11B-v2
parameters:
  t:
    - filter: self_attn
      value: [0, 0.7, 0.5, 0.8, 1]  # Increased influence of Moistral model
    - filter: mlp
      value: [1, 0.3, 0.5, 0.2, 0]  # Decreased influence of RoleBeagle model
    - value: 0.7  # Increased fallback value for rest of tensors
dtype: float16
```
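
In this config, each five-element list under `t` is a layer gradient: mergekit spreads the anchor values evenly across the model's layers and linearly interpolates between them, so every layer's `self_attn` and `mlp` tensors get their own blend ratio (`t = 0` keeps the base model `./models/Moistral-11B-v2`, `t = 1` takes `./models/RoleBeagle-11B`), while tensors matched by neither filter use the scalar fallback `t: 0.7`. A rough illustration of that mapping, assuming 48 hidden layers (the real count comes from the model's `config.json`):

```python
import numpy as np

num_layers = 48  # assumption; read the actual value from config.json
gradients = {
    "self_attn": [0, 0.7, 0.5, 0.8, 1],
    "mlp": [1, 0.3, 0.5, 0.2, 0],
}

# Place the anchor values at evenly spaced points over [0, 1] and
# linearly interpolate to get one t value per layer.
layer_pos = np.linspace(0.0, 1.0, num_layers)
for name, anchors in gradients.items():
    anchor_pos = np.linspace(0.0, 1.0, len(anchors))
    per_layer_t = np.interp(layer_pos, anchor_pos, anchors)
    print(name, np.round(per_layer_t[:5], 2), "...", np.round(per_layer_t[-5:], 2))
```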