
NemoMix-12B-DellaV1b

NemoMix-12B-DellaV1b is an experimental merge of the following models, created with mergekit using the della_linear merge method:

* BeaverAI/mistral-doryV2-12b
* intervitens/mini-magnum-12b-v1.1
* grimjim/mistralai-Mistral-Nemo-Base-2407 (base model)

This merge works well, but is very horny. Extremely NSFW.

🧩 Configuration

```yaml
models:
  - model: BeaverAI/mistral-doryV2-12b
    parameters:
      weight: 0.30
      density: 0.42
  - model: intervitens/mini-magnum-12b-v1.1
    parameters:
      weight: 0.35
      density: 0.66
  - model: grimjim/mistralai-Mistral-Nemo-Base-2407
    parameters:
      weight: 0.35
      density: 0.78
merge_method: della_linear
base_model: grimjim/mistralai-Mistral-Nemo-Base-2407
parameters:
  int8_mask: true
  normalize: true
  epsilon: 0.1
  lambda: 1.0
  density: 0.7
dtype: bfloat16
```
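
To reproduce the merge, the configuration above can be saved as `config.yaml` and fed to mergekit. Below is a minimal sketch using mergekit's Python entry points; the exact import paths and `MergeOptions` fields may differ between mergekit versions, and the output path is only illustrative.

```python
# Minimal sketch: reproduce the merge via mergekit's Python API.
# Assumes the YAML above is saved as config.yaml; option names may
# vary between mergekit releases.
import yaml
import torch

from mergekit.config import MergeConfiguration
from mergekit.merge import MergeOptions, run_merge

with open("config.yaml", "r", encoding="utf-8") as fp:
    merge_config = MergeConfiguration.model_validate(yaml.safe_load(fp))

run_merge(
    merge_config,
    out_path="./NemoMix-12B-DellaV1b",          # illustrative output directory
    options=MergeOptions(
        cuda=torch.cuda.is_available(),          # merge on GPU if one is present
        copy_tokenizer=True,                     # carry the base tokenizer into the output
        lazy_unpickle=False,
        low_cpu_memory=False,
    ),
)
```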
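💻 Usage

A minimal sketch of loading the merged model with transformers. It assumes the merged tokenizer ships a chat template (as the Mistral-Nemo base does); the prompt and sampling parameters are only illustrative.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "jsfs11/NemoMix-12B-DellaV1b"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # matches the merge dtype
    device_map="auto",
)

messages = [{"role": "user", "content": "Write a short scene set on a rainy night."}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output = model.generate(input_ids, max_new_tokens=256, do_sample=True, temperature=0.8)
print(tokenizer.decode(output[0][input_ids.shape[-1]:], skip_special_tokens=True))
```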
