---
base_model:
  - sometimesanotion/Lamarck-14B-v0.7
  - sometimesanotion/Qwenvergence-14B-v12-Prose-DS
  - jpacifico/Chocolatine-2-14B-Instruct-v2.0.3
  - suayptalha/Lamarckvergence-14B
library_name: transformers
tags:
  - mergekit
  - merge
license: apache-2.0
language:
  - en
---

EXPERIMENTAL:

So what's this new arcee_fusion merge method, and what can we do with it? This model aims to find out: it is a multi-stage merge in which three of the four steps are fusions.
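As a rough sketch (not the actual recipe used here), a single arcee_fusion step in a mergekit YAML config pairs a base model with one donor model. The model names below are taken from the metadata above, but the pairing, ordering, and dtype are illustrative assumptions:

```yaml
# Illustrative single fusion step; arcee_fusion merges exactly two
# models, so a multi-stage recipe chains several steps like this,
# feeding each step's output checkpoint into the next.
merge_method: arcee_fusion
base_model: sometimesanotion/Lamarck-14B-v0.7
models:
  - model: sometimesanotion/Qwenvergence-14B-v12-Prose-DS
dtype: bfloat16
```

A config like this would be run with `mergekit-yaml config.yaml ./output-model`, and the resulting checkpoint could then serve as the base_model of a later fusion stage.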

I've seen strong prose from this model, which is natural given its re-emphasis of Qwenvergence-14B-v12-Prose-DS. A full evaluation will be queued shortly.

This recipe is actually much simpler than a mainline Lamarck release, and where it fits into efforts toward a Lamarck v0.8 depends greatly on evaluation and feedback.