# Yugo55-GPT-v4

**datatab/Yugo55-GPT-v4** is a merge of the following models using LazyMergekit:

- datatab/Serbian-Mistral-Orca-Slim-v1
- mlabonne/AlphaMonarch-7B
- datatab/YugoGPT-Alpaca-v1-epoch1-good

## 🧩 Configuration

```yaml
models:
  - model: datatab/Serbian-Mistral-Orca-Slim-v1
    parameters:
      weight: 1.0
  - model: mlabonne/AlphaMonarch-7B
    parameters:
      weight: 1.0
  - model: datatab/YugoGPT-Alpaca-v1-epoch1-good
    parameters:
      weight: 1.0
merge_method: linear
dtype: float16
```
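The configuration above can be reproduced with the mergekit command-line tool. A minimal sketch, assuming mergekit is installed and the YAML is saved as `config.yaml` (a hypothetical filename, not part of the original card):

```shell
pip install mergekit

# Run the linear merge described in config.yaml.
# Merged FP16 weights and the copied tokenizer are written to ./Yugo55-GPT-v4
mergekit-yaml config.yaml ./Yugo55-GPT-v4 --copy-tokenizer
```

With `merge_method: linear` and equal weights of 1.0, mergekit averages the three models' parameters uniformly after normalizing the weights.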

## 💻 Usage

# TBD
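Until an official snippet is added, a minimal inference sketch with the `transformers` library (assumes `transformers`, `accelerate`, and `torch` are installed and the weights can be downloaded from the Hugging Face Hub; the prompt and generation parameters are illustrative):

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "datatab/Yugo55-GPT-v4"

def generate(prompt: str, max_new_tokens: int = 256) -> str:
    # Load tokenizer and FP16 weights (matching the card's dtype: float16).
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID, torch_dtype=torch.float16, device_map="auto"
    )
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    outputs = model.generate(
        **inputs, max_new_tokens=max_new_tokens, do_sample=True, temperature=0.7
    )
    return tokenizer.decode(outputs[0], skip_special_tokens=True)

if __name__ == "__main__":
    # "What is a large language model?" in Serbian.
    print(generate("Šta je veliki jezički model?"))
```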
Model size: 7.24B params · Tensor type: FP16 (Safetensors)

