Misted-Toppy-11B

Thanks to mradermacher for the quants! You can find them here: mradermacher/Misted-Toppy-11B-GGUF
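
If you want to run one of the quants locally, here's a rough sketch using llama-cpp-python. The file name below is just a guess at the naming scheme, so check the GGUF repo for the actual files.

```python
# Minimal sketch: run a GGUF quant locally with llama-cpp-python.
# The model_path is hypothetical; download the real file from mradermacher/Misted-Toppy-11B-GGUF.
from llama_cpp import Llama

llm = Llama(
    model_path="Misted-Toppy-11B.Q4_K_M.gguf",  # assumed file name
    n_ctx=4096,
    n_gpu_layers=-1,  # offload all layers to GPU if one is available
)

out = llm(
    "You're a roleplay partner. Stay in character.\nUser: Hey!\nAssistant:",
    max_tokens=128,
)
print(out["choices"][0]["text"])
```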

My first model ever. Roast me.

It's a frankenmerge of Walmart-the-bag/Misted-v2-7B and Undi95/Toppy-M-7B, landing at 10.7B params in BF16. It seems to work alright, though I've only tested it briefly. Not the smartest model out there, but it's decent for RP. There's a quick loading sketch below if you want to try the full-precision weights.
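
Here's a minimal sketch of loading it with transformers (assumes you have enough VRAM for ~11B in BF16; the prompt is just an example).

```python
# Minimal sketch: load the merged model and generate text with transformers.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "matchaaaaa/Misted-Toppy-11B"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # weights are stored in BF16
    device_map="auto",
)

prompt = "Write a short scene where two rivals get stuck sharing an umbrella."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=200, do_sample=True, temperature=0.8)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```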

Fair warning: it's also pretty heavily censored out of the box.

Anyway, have a nice day!
