---
license: cc-by-nc-4.0
tags:
- moe
- frankenmoe
- merge
- mergekit
- lazymergekit
- autoquant
- gguf
base_model:
- mlabonne/AlphaMonarch-7B
- beowolx/CodeNinja-1.0-OpenChat-7B
- SanjiWatsuki/Kunoichi-DPO-v2-7B
- mlabonne/NeuralDaredevil-7B
---
Beyonder-4x7B-v3 is a Mixture of Experts (MoE) model made with the following models using LazyMergekit:

- [mlabonne/AlphaMonarch-7B](https://huggingface.co/mlabonne/AlphaMonarch-7B)
- [beowolx/CodeNinja-1.0-OpenChat-7B](https://huggingface.co/beowolx/CodeNinja-1.0-OpenChat-7B)
- [SanjiWatsuki/Kunoichi-DPO-v2-7B](https://huggingface.co/SanjiWatsuki/Kunoichi-DPO-v2-7B)
- [mlabonne/NeuralDaredevil-7B](https://huggingface.co/mlabonne/NeuralDaredevil-7B)
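
For reference, a `mergekit-moe` configuration for this kind of frankenMoE generally looks like the sketch below. This is a minimal illustration under assumed settings, not the exact config used for Beyonder-4x7B-v3: in particular, the `positive_prompts` entries are hypothetical placeholders standing in for the prompts that steer the router toward each expert's specialty.

```yaml
# Minimal mergekit-moe config sketch (illustrative, not the original).
# The base model supplies the shared (non-expert) weights; each expert
# contributes MLP weights and is selected via its positive_prompts.
base_model: mlabonne/AlphaMonarch-7B
experts:
  - source_model: mlabonne/AlphaMonarch-7B
    positive_prompts:
      - "chat"          # hypothetical routing prompt
  - source_model: beowolx/CodeNinja-1.0-OpenChat-7B
    positive_prompts:
      - "code"          # hypothetical routing prompt
  - source_model: SanjiWatsuki/Kunoichi-DPO-v2-7B
    positive_prompts:
      - "storywriting"  # hypothetical routing prompt
  - source_model: mlabonne/NeuralDaredevil-7B
    positive_prompts:
      - "reasoning"     # hypothetical routing prompt
```

With mergekit installed, a config like this is typically run with `mergekit-moe config.yaml ./output-dir`; LazyMergekit wraps the same workflow in a notebook.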