---
base_model:
- ChaoticNeutrals/BuRP_7B
- TeeZee/DarkSapling-7B-v1.1
- Endevor/EndlessRP-v2-7B
- rmdhirr/Foxglove_7B
- kainatq/KPT-7B
- GlobalMeltdown/MaidenlessNoMore-7B
library_name: transformers
tags:
- mergekit
- merge
---
|
This is a merge of pre-trained language models created with mergekit.
|
|
|
Designed as part of an effort to build a Russian-capable 7B model.
|
|
|
It is good and fast for RP, ERP, and chat. It sometimes hallucinates, but sometimes writes excellently on the first try.
|
|
|
This one is more stable than v3.
|
|
|
Of course, if you can, try at least a 12B model with offloading instead: it may be slower, but it is far "smarter" than any 7B/8B.
|
|
|
Tested with the ChatML prompt format at temperature 1.01.
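For reference, ChatML wraps each conversation turn in `<|im_start|>` / `<|im_end|>` tokens. A minimal sketch of building such a prompt by hand (the role names and message contents below are placeholder examples, not taken from this card):

```python
# Minimal sketch of the ChatML prompt format this model was tested with.
# The system prompt and user message are placeholders for illustration.

def format_chatml(messages):
    """Render a list of {"role", "content"} dicts as a ChatML prompt string,
    leaving an open assistant turn for the model to complete."""
    parts = []
    for msg in messages:
        parts.append(f"<|im_start|>{msg['role']}\n{msg['content']}<|im_end|>")
    parts.append("<|im_start|>assistant\n")
    return "\n".join(parts)

prompt = format_chatml([
    {"role": "system", "content": "You are a helpful roleplay assistant."},
    {"role": "user", "content": "Describe the tavern we just entered."},
])
print(prompt)
```

In practice, `tokenizer.apply_chat_template` from `transformers` does this for you when the tokenizer ships a ChatML template; the sketch above only shows what the resulting string looks like.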
|
|