Fett-uccine

This model was created by training the Mistral base model on LimaRP (ShareGPT format provided by SAO), Theory of Mind, and Gnosis (provided by jeiku).

The 8-bit LoRA was then merged into Mistral Instruct, resulting in the model you see here.
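
For reference, a minimal sketch of such a LoRA merge using the PEFT library. The adapter path is a placeholder, and the exact Mistral Instruct version is an assumption, since the card does not state it:

```python
import torch
from transformers import AutoModelForCausalLM
from peft import PeftModel

# Load the Mistral Instruct base model (the exact Instruct version is an assumption).
base = AutoModelForCausalLM.from_pretrained(
    "mistralai/Mistral-7B-Instruct-v0.2", torch_dtype=torch.float16
)

# Attach the trained LoRA adapter on top of the base model.
# "path/to/fettuccine-lora" is a placeholder for the actual adapter weights.
model = PeftModel.from_pretrained(base, "path/to/fettuccine-lora")

# Fold the adapter weights into the base model and save the merged result.
merged = model.merge_and_unload()
merged.save_pretrained("Fett-uccine-merged")
```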

Works best with the ChatML instruct format.
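
For anyone assembling prompts by hand rather than through a frontend, here is a rough sketch of the standard ChatML layout; the system and user messages are illustrative only:

```python
def chatml_prompt(system: str, user: str) -> str:
    # Standard ChatML layout: each turn is wrapped in <|im_start|>/<|im_end|> tags,
    # and the prompt ends with an open assistant turn for the model to complete.
    return (
        f"<|im_start|>system\n{system}<|im_end|>\n"
        f"<|im_start|>user\n{user}<|im_end|>\n"
        f"<|im_start|>assistant\n"
    )

prompt = chatml_prompt("You are a creative roleplay partner.", "Introduce your character.")
```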

This model is in honor of the SillyTavern community. Keep being awesome!
