Mini-Mixtral-v0.2 is a Mixture of Experts (MoE) model made from the following models using LazyMergekit:
Base model: NeuralNovel/Mini-Mixtral-v0.2
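
The merged model can be loaded like any other Hugging Face causal language model. The snippet below is a minimal sketch assuming the standard `transformers` API; the repo id is taken from the base-model link above, and the prompt is purely illustrative.

```python
# Minimal sketch: load and sample from the merged MoE model with transformers.
# Assumes transformers is installed (and accelerate, for device_map="auto").
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "NeuralNovel/Mini-Mixtral-v0.2"  # repo id from the base-model link above

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",   # use the checkpoint's native dtype
    device_map="auto",    # requires accelerate; places weights on available devices
)

# Illustrative prompt only; not taken from the model card.
inputs = tokenizer(
    "Explain what a Mixture of Experts model is.",
    return_tensors="pt",
).to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```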