
Marcoro14-7B-slerp

Marcoro14-7B-slerp is a merge of the following models using mergekit:

* unsloth/gemma-2b-bnb-4bit
* jiayihao03/gemma_2b_code_python_4bit

Despite the "slerp" in the name, the configuration below uses the ties merge method, with unsloth/gemma-2b-bnb-4bit as the base model.

🧩 Configuration

```yaml
models:
  - model: unsloth/gemma-2b-bnb-4bit
    # no parameters necessary for base model
  - model: jiayihao03/gemma_2b_code_python_4bit
    parameters:
      density: 0.5
      weight: 0.3
merge_method: ties
base_model: unsloth/gemma-2b-bnb-4bit
parameters:
  normalize: true
dtype: float16
```
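The ties method builds a "task vector" for each fine-tuned model (its weights minus the base model's), keeps only the top `density` fraction of entries by magnitude, elects a sign per parameter, and combines the agreeing contributions, rescaling by the participating weights when `normalize: true`. A toy sketch of that procedure in plain Python (illustrative only, not mergekit's actual implementation; the function names here are made up):

```python
def trim(delta, density):
    """Keep only the largest-magnitude fraction `density` of entries; zero the rest."""
    k = max(1, int(len(delta) * density))
    thresh = sorted((abs(d) for d in delta), reverse=True)[k - 1]
    return [d if abs(d) >= thresh else 0.0 for d in delta]

def ties_merge(base, finetuned, weights, density, normalize=True):
    """Merge flattened parameter lists TIES-style: trim, elect sign, combine."""
    deltas = [trim([f - b for f, b in zip(ft, base)], density) for ft in finetuned]
    merged = []
    for i, b in enumerate(base):
        # Elect a sign per parameter from the weighted sum of deltas.
        sign = 1.0 if sum(w * d[i] for w, d in zip(weights, deltas)) >= 0 else -1.0
        # Combine only the deltas that agree with the elected sign.
        num = sum(w * d[i] for w, d in zip(weights, deltas) if d[i] * sign > 0)
        den = sum(w for w, d in zip(weights, deltas) if d[i] * sign > 0)
        merged.append(b + num / den if normalize and den else b + num)
    return merged

# One base and one fine-tuned model, density 0.5, weight 0.3 (as in the config above):
print(ties_merge([1.0, 1.0, 1.0, 1.0], [[2.0, 0.0, 1.5, 1.0]], [0.3], 0.5))
# → [2.0, 0.0, 1.0, 1.0]
```

Note that with a single non-base model and `normalize: true`, the weight cancels out: the surviving half of the task vector is applied at full strength, and only `density` controls how much of the fine-tune carries over.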
Model size: 1.52B params · Tensor type: FP16 · Format: Safetensors