MixtureofMerges-MoE-2x7b-v0.01b-DELLA

MixtureofMerges-MoE-2x7b-v0.01b-DELLA is a merge of the following models using [mergekit](https://github.com/arcee-ai/mergekit):

* [yunconglong/Truthful_DPO_TomGrc_FusionNet_7Bx2_MoE_13B](https://huggingface.co/yunconglong/Truthful_DPO_TomGrc_FusionNet_7Bx2_MoE_13B)
* [jsfs11/MixtureofMerges-MoE-2x7b-v6](https://huggingface.co/jsfs11/MixtureofMerges-MoE-2x7b-v6)

🧩 Configuration

```yaml
models:
  - model: yunconglong/Truthful_DPO_TomGrc_FusionNet_7Bx2_MoE_13B
    parameters:
      weight: 1.0
  - model: jsfs11/MixtureofMerges-MoE-2x7b-v6
    parameters:
      weight: 1.0
merge_method: della
base_model: yunconglong/Truthful_DPO_TomGrc_FusionNet_7Bx2_MoE_13B
parameters:
  density: 0.6
  epsilon: 0.2
  lambda: 1.0
dtype: bfloat16
```
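
In mergekit's DELLA method, these parameters control how each model's deltas from the base model are pruned and rescaled: `density: 0.6` retains roughly 60% of the delta parameters, `epsilon: 0.2` sets the spread of the magnitude-based drop probabilities around that density, and `lambda: 1.0` applies no extra scaling to the merged deltas. The merge itself can be reproduced by saving the configuration above as `config.yaml` and running mergekit's `mergekit-yaml config.yaml ./output-model` command.

💻 Usage

Below is a minimal inference sketch using 🤗 Transformers. The model ID comes from this card; the prompt, generation settings, and device placement are illustrative assumptions, not part of the original card.

```python
# Minimal inference sketch (assumes transformers, torch, and accelerate are
# installed, and enough memory for a ~12.9B-parameter model).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "jsfs11/MixtureofMerges-MoE-2x7b-v0.01b-DELLA"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # matches the dtype used for the merge
    device_map="auto",           # spreads layers across available devices
)

# Hypothetical prompt; replace with your own.
prompt = "Explain mixture-of-experts language models in one paragraph."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=200)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```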