athirdpath committed
Commit a043c6b
Parent(s): 64a23ac
Update README.md
README.md CHANGED
@@ -9,6 +9,8 @@ Eileithyia-13B is an unaligned, roleplay oriented model created by merging [Kobo
 
 Eileithyia, as is the current trend, is named after a Greek goddess; in this case it is the goddess of childbirth and pregnancy.
 
+![image/png](https://cdn-lfs-us-1.huggingface.co/repos/d4/88/d488ea799b83cecc04b4e30cb3b94d0b526752ded5639e34d5825393cdfccce2/d94a938f05187554bd48bcff454737b4980d3c15ef39cff144d0dd3c5686fcb7?response-content-disposition=inline%3B+filename*%3DUTF-8%27%27f57c7d76-2494-490b-9edc-bedbda02c112.png%3B+filename%3D%22f57c7d76-2494-490b-9edc-bedbda02c112.png%22%3B&response-content-type=image%2Fpng&Expires=1700115010&Policy=eyJTdGF0ZW1lbnQiOlt7IkNvbmRpdGlvbiI6eyJEYXRlTGVzc1RoYW4iOnsiQVdTOkVwb2NoVGltZSI6MTcwMDExNTAxMH19LCJSZXNvdXJjZSI6Imh0dHBzOi8vY2RuLWxmcy11cy0xLmh1Z2dpbmdmYWNlLmNvL3JlcG9zL2Q0Lzg4L2Q0ODhlYTc5OWI4M2NlY2MwNGI0ZTMwY2IzYjk0ZDBiNTI2NzUyZGVkNTYzOWUzNGQ1ODI1MzkzY2RmY2NjZTIvZDk0YTkzOGYwNTE4NzU1NGJkNDhiY2ZmNDU0NzM3YjQ5ODBkM2MxNWVmMzljZmYxNDRkMGRkM2M1Njg2ZmNiNz9yZXNwb25zZS1jb250ZW50LWRpc3Bvc2l0aW9uPSomcmVzcG9uc2UtY29udGVudC10eXBlPSoifV19&Signature=fcKuHcC9UkVe5Lg%7EamhfIjidH4oBkqMP-zZzFTikTQJmbBvgA88p9zwrYMUm%7EBnesvSzaesc2CaPv8dxJM9cxp1E-e8MIMHn52gy1Yad4wSzFA5SNFT4WJPa024fTrKa3lmOUm0KxLC9BKx9slMZWlFea%7E8Gm%7EMxW67R%7E-KPsZ1fBfRSOBl1ZXCBy3gF1RE0tFTlYMJmEgLH3chQlW37WjKBJmCyRTw4-ekg23fbdFHlq%7EWDWhBYUUvCvY0ypcqvft2I80RMJUaG1Hkf3sVxCOwmwIfZNrXH0oRilSjFtbV573IwETTD5lko6bTVyNjeYr6sG4I5Le1nOdPRtpJOWw__&Key-Pair-Id=KCD77M1F0VK2B)
+
 The private ~400k token dataset used to train the LORA was Alpaca formatted and focused on 4 primary categories:
 
 - Medical texts (on pregnancy, reproductive organs, and impregnation). These are formatted so the model, in character as a doctor, answers a patient's question in short to medium form.
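The hunk above notes that the LORA's training data was Alpaca formatted. For readers unfamiliar with that layout, here is a minimal sketch of the standard Alpaca prompt template; the example record (a doctor answering a patient's question) is purely hypothetical and not drawn from the private dataset.

```python
# Minimal sketch of the standard Alpaca prompt layout referenced in the README.
# The record below is a hypothetical illustration; the actual ~400k-token dataset is private.
record = {
    "instruction": "Answer the patient's question as their doctor, briefly and accurately.",
    "input": "Doctor, how early can a home test detect pregnancy?",
    "output": "Most home urine tests are reliable from about the first day of a missed period.",
}

# Standard Alpaca template for records that include an input field.
ALPACA_TEMPLATE = (
    "Below is an instruction that describes a task, paired with an input that provides "
    "further context. Write a response that appropriately completes the request.\n\n"
    "### Instruction:\n{instruction}\n\n"
    "### Input:\n{input}\n\n"
    "### Response:\n{output}"
)

print(ALPACA_TEMPLATE.format(**record))
```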