Update README.md
README.md CHANGED
@@ -23,7 +23,7 @@ So what's this new arcee_fusion merge method, and what can we do with it? This
 
 I've seen strong prose from this model, which is natural considering its re-emphasis of Qwenvergence-14B-v12-Prose-DS. A full evaluation will be cued shortly.
 
-This
+This merge strategy is much simpler than a mainline Lamarck release, but that is necessary to see how multiple fusion merges behave. Where it fits for efforts towards a Lamarck v0.8 depends greatly on evaluation and feedback.
 
 ### Configuration
 
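For readers unfamiliar with the arcee_fusion method referenced in the hunk header: it is one of mergekit's merge methods, and it fuses a base model with a single other model. The sketch below is a hypothetical minimal config, not the actual recipe from this commit — the second model name is a placeholder assumption.

```yaml
# Hypothetical mergekit config sketch for an arcee_fusion merge.
# arcee_fusion takes a base model plus exactly one other model;
# the non-base model listed here is an illustrative placeholder.
merge_method: arcee_fusion
base_model: sometimesanotion/Qwenvergence-14B-v12-Prose-DS
models:
  - model: Qwen/Qwen2.5-14B-Instruct
dtype: bfloat16
```

A config like this would be run with mergekit's CLI (`mergekit-yaml config.yml ./output-dir`); the actual configuration for this model is what the `### Configuration` section above documents.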