---
tags:
- not-for-all-audiences
license: cc-by-nc-4.0
---
GGUF quants of Eileithyia-20b.
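Below is a minimal sketch of loading one of these quants with llama-cpp-python. The quant filename, context length, and prompt are placeholders, not taken from this repo.

```python
# Minimal sketch: running a GGUF quant with llama-cpp-python.
# The .gguf filename below is a placeholder -- use whichever quant you downloaded.
from llama_cpp import Llama

llm = Llama(
    model_path="eileithyia-20b.Q4_K_M.gguf",  # placeholder filename
    n_ctx=4096,                               # context length; adjust as needed
)

# Alpaca-style prompt, matching the training format described below.
out = llm(
    "### Instruction:\nEnter RP mode.\n\n### Response:\n",
    max_tokens=256,
    stop=["### Instruction:"],
)
print(out["choices"][0]["text"])
```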
Eileithyia-20b is an uncensored, roleplay-oriented model created by training a rank-128 [bespoke LoRA](https://huggingface.co/athirdpath/Eileithyia-20b-LORA) on top of [athirdpath/Harmonia-20B](https://huggingface.co/athirdpath/Harmonia-20B).
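For reference, here is a rough sketch (not the author's actual script) of how the composition described above could be reproduced with transformers and peft, using the two repositories linked in this card; dtype and device settings are assumptions.

```python
# Rough sketch: attach the published rank-128 LoRA to the base model and merge
# it into a standalone checkpoint (which could then be converted to GGUF).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base = AutoModelForCausalLM.from_pretrained(
    "athirdpath/Harmonia-20B",
    torch_dtype=torch.float16,
    device_map="auto",
)
tokenizer = AutoTokenizer.from_pretrained("athirdpath/Harmonia-20B")

# Apply the LoRA, then fold its weights into the base model.
model = PeftModel.from_pretrained(base, "athirdpath/Eileithyia-20b-LORA")
merged = model.merge_and_unload()

merged.save_pretrained("Eileithyia-20b-merged")
tokenizer.save_pretrained("Eileithyia-20b-merged")
```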
Eileithyia, as is the current trend, is named after a Greek goddess; in this case, the goddess of childbirth and pregnancy.
![image/png](https://i.ibb.co/zR1CX4G/ele.png)
The private ~400k-token dataset used to train the LoRA was Alpaca-formatted and focused on five primary categories (illustrative records follow the list):
- Medical texts (on pregnancy, reproductive organs, and impregnation). These are formatted so the model, in character as a doctor, answers a patient's question in short- to medium-length form.
- Excerpts from short stories and novellas (erotic, romantic, and platonic) centered on both realistic and fantastic pregnancy. These are sliced into ~2048-token chunks, and these long-form responses are all tied to the instruction “Enter narrator mode.”
- A selection from [PIPPA](https://huggingface.co/datasets/PygmalionAI/PIPPA), gathered with a broad keyword search for related terms and then human-curated (...the things I’ve seen…). These are converted to Alpaca format with “Enter RP mode.” in all the instruction fields.
- ~42k tokens of GPT-4-generated data on pregnancy from various characters’ perspectives, focusing on different responses and stages. Also includes a synopsis of each week, written in various styles.
- ~18k tokens of GPT-4-generated data on non-maternal roleplay from various characters’ perspectives, focusing on different situations and emotions. Includes many multi-turn conversations.
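The sketch below shows what Alpaca-style records for these categories might look like. Only the format and the instruction strings “Enter narrator mode.” and “Enter RP mode.” come from the description above; the medical instruction wording and all field contents are invented placeholders.

```python
# Illustrative Alpaca-style records matching the formats described above.
# Field contents are placeholders, not samples from the actual dataset.
import json

records = [
    {   # long-form narrative chunk (~2048 tokens in the real data)
        "instruction": "Enter narrator mode.",
        "input": "",
        "output": "<one ~2048-token slice of a short story or novella>",
    },
    {   # PIPPA-derived roleplay turn converted to Alpaca
        "instruction": "Enter RP mode.",
        "input": "<conversation context / character card>",
        "output": "<character's reply>",
    },
    {   # medical Q&A, model answering in character as a doctor
        # (instruction wording here is a guess; the card does not state it)
        "instruction": "Answer the patient's question as their doctor.",
        "input": "<patient's question>",
        "output": "<short- to medium-length answer>",
    },
]

# Write the records as JSON Lines, one example per line.
with open("dataset.jsonl", "w") as f:
    for rec in records:
        f.write(json.dumps(rec) + "\n")
```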