# moecule-1.1b-m2
This is a merge of pre-trained language models created using [mergekit](https://github.com/arcee-ai/mergekit).
## Merge Details
### Merge Method
This model was merged using the [TIES](https://arxiv.org/abs/2306.01708) merge method, with TinyLlama-1.1B-1T-OpenOrca as the base.
### Models Merged
The following models were included in the merge:
- tiny-llama-1.1b-chat-medical
- TinyLlama-1.1B-intermediate-step-1195k-token-2.5T
### Configuration
The following YAML configuration was used to produce this model:
```yaml
models:
  - model: tiny-llama-1.1b-chat-medical
    parameters:
      density: [1, 0.7, 0.1] # density gradient
      weight: 1.0
  - model: TinyLlama-1.1B-intermediate-step-1195k-token-2.5T
    parameters:
      density: 0.5
      weight: [0, 0.3, 0.7, 1] # weight gradient
merge_method: ties
base_model: TinyLlama-1.1B-1T-OpenOrca
parameters:
  normalize: true
  int8_mask: true
dtype: bfloat16
```
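The merge can be reproduced with mergekit's `mergekit-yaml` CLI or its Python API. Below is a minimal sketch using the Python API, assuming mergekit is installed and the configuration above is saved as `config.yml`; the output path and option values are illustrative, not part of this card.

```python
# Minimal reproduction sketch (assumed setup: `pip install mergekit`,
# with the YAML above saved as config.yml). Paths are illustrative.
import yaml
import torch

from mergekit.config import MergeConfiguration
from mergekit.merge import MergeOptions, run_merge

OUTPUT_PATH = "./moecule-1.1b-m2"  # hypothetical output directory

# Parse and validate the YAML merge recipe.
with open("config.yml", "r", encoding="utf-8") as fp:
    merge_config = MergeConfiguration.model_validate(yaml.safe_load(fp))

# Execute the TIES merge and write the merged checkpoint to OUTPUT_PATH.
run_merge(
    merge_config,
    out_path=OUTPUT_PATH,
    options=MergeOptions(
        cuda=torch.cuda.is_available(),  # merge on GPU when available
        copy_tokenizer=True,             # carry a tokenizer into the output
    ),
)
```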
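Once merged, the output directory is a standard Llama-architecture checkpoint and loads with `transformers` like any other causal language model. A minimal usage sketch, assuming the merge was written to the hypothetical `./moecule-1.1b-m2` directory from the sketch above:

```python
# Minimal usage sketch; the local path is the hypothetical output
# directory from the merge sketch above.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_path = "./moecule-1.1b-m2"
tokenizer = AutoTokenizer.from_pretrained(model_path)
model = AutoModelForCausalLM.from_pretrained(model_path)

prompt = "What are common symptoms of dehydration?"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```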