FusionNet

Fine-tuned model for the English language using the MoE (Mixture of Experts) method.

Model description

FusionNet is an experiment with the Mixture-of-Experts (MoE) method, which can significantly increase the performance of the original model. FusionNet has 12.9B parameters and has been fine-tuned. Enjoy!
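A minimal usage sketch with the transformers library; the prompt and generation settings below are illustrative and not part of the original card:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "TomGrc/FusionNet_7Bx2_MoE_14B"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",   # picks up the checkpoint's BF16 weights
    device_map="auto",    # requires accelerate; spreads the 12.9B params across available devices
)

prompt = "Explain the Mixture-of-Experts architecture in one paragraph."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=200)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```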

Open LLM Leaderboard Evaluation Results

Detailed results can be found on the Open LLM Leaderboard.

Metric                              Value
Avg.                                75.91
AI2 Reasoning Challenge (25-shot)   73.55
HellaSwag (10-shot)                 88.84
MMLU (5-shot)                       64.68
TruthfulQA (0-shot)                 69.60
Winogrande (5-shot)                 88.16
GSM8k (5-shot)                      70.66
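The Avg. row is the unweighted mean of the six benchmark scores; a quick sanity check:

```python
# Benchmark scores from the table above
scores = [73.55, 88.84, 64.68, 69.60, 88.16, 70.66]
print(sum(scores) / len(scores))  # 75.915, reported as 75.91 on the leaderboard
```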