VitalContribution
committed
Update README.md
README.md
CHANGED
@@ -12,7 +12,7 @@ library_name: adapter-transformers
 
 I was just curious to see if something special might happen if one uses:
 
 $$
-\text{{
+\text{{high-quality DPO dataset}} + \text{{merge of DPO optimized and non-DPO optimized model}}
 $$
 
 The underlying model that I used was `/Weyaxi/OpenHermes-2.5-neural-chat-v3-3-Slerp`.
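The "merge of DPO optimized and non-DPO optimized model" in the added line refers to weight-space model merging, and the base model's name suggests a SLERP merge. Below is a minimal sketch of SLERP applied per weight tensor; the layer name and tensors are hypothetical stand-ins, not the actual checkpoints, and real merges are typically done with a tool such as mergekit:

```python
import numpy as np

def slerp(t, v0, v1, eps=1e-8):
    """Spherical linear interpolation between two weight tensors of equal shape."""
    v0f = v0.ravel().astype(np.float64)
    v1f = v1.ravel().astype(np.float64)
    # Cosine of the angle between the two flattened weight vectors
    dot = np.dot(v0f, v1f) / (np.linalg.norm(v0f) * np.linalg.norm(v1f) + eps)
    dot = np.clip(dot, -1.0, 1.0)
    omega = np.arccos(dot)
    if omega < eps:
        # Nearly parallel vectors: fall back to plain linear interpolation
        return (1 - t) * v0 + t * v1
    s = np.sin(omega)
    out = (np.sin((1 - t) * omega) / s) * v0f + (np.sin(t * omega) / s) * v1f
    return out.reshape(v0.shape)

# Hypothetical per-layer state dicts for a non-DPO base and a DPO-optimized model
base = {"w": np.array([[0.0, 1.0], [1.0, 0.0]])}
dpo = {"w": np.array([[1.0, 0.0], [0.0, 1.0]])}

# Merge halfway between the two checkpoints, tensor by tensor
merged = {name: slerp(0.5, base[name], dpo[name]) for name in base}
```

Unlike plain averaging, SLERP interpolates along the arc between the two weight vectors, which preserves their norm when the endpoints have equal norm.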