Silicon-Alice-7B

What's that?

Silicon-Alice-7B is a model based on Silicon-Masha-7B that aims to be strong at roleplay, reason well, understand Russian, and follow character cards closely. It understands Russian better than its predecessor and is suitable for RP/ERP as well as general use.

Prompt Template (Alpaca)

I found the best SillyTavern results using the Noromaid template, but please try other templates! Let me know if you find anything good.

SillyTavern config files: Context, Instruct.

Additionally, here is my highly recommended Text Completion preset. You can tweak it: raise the temperature or lower min p to boost creativity, or raise min p to increase stability. You shouldn't need to touch anything else!
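
The creativity/stability trade-off above can be sketched as a small helper. The parameter values here are illustrative assumptions, not the actual preset; the linked Text Completion preset is authoritative.

```python
# Illustrative sampler settings for the creativity/stability trade-off.
# The exact numbers are hypothetical; use the linked preset for real values.
def make_sampler_settings(creative: bool = False) -> dict:
    """Return sampling parameters for text completion.

    Raising temperature or lowering min_p boosts creativity;
    raising min_p increases stability.
    """
    if creative:
        return {"temperature": 1.2, "min_p": 0.05}  # looser: more varied output
    return {"temperature": 1.0, "min_p": 0.10}      # tighter: more stable output
```

These dicts can be passed to whichever backend you use (e.g. as generation kwargs), as long as it supports min-p sampling.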

Below is an instruction that describes a task. Write a response that appropriately completes the request.
### Instruction:
{prompt}
### Response:
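
The Alpaca template above can be filled in with a small helper. `build_prompt` is a hypothetical name for illustration; any string formatting that produces the same layout works.

```python
# The Alpaca prompt template from the model card, as a format string.
ALPACA_TEMPLATE = (
    "Below is an instruction that describes a task. "
    "Write a response that appropriately completes the request.\n"
    "### Instruction:\n"
    "{prompt}\n"
    "### Response:\n"
)

def build_prompt(prompt: str) -> str:
    """Wrap a user instruction in the Alpaca template expected by the model."""
    return ALPACA_TEMPLATE.format(prompt=prompt)
```

The model's completion is then generated from the text that follows `### Response:`.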
Model size: 7.24B parameters (Safetensors, BF16)

Model tree for LakoMoor/Silicon-Alice-7B: 2 merges and 7 quantizations of this model are available.