sometimesanotion committed: Update README.md
README.md
CHANGED
@@ -48,7 +48,7 @@ This model was made in two branches: a della_linear merge, and a sequence of mo
 
 ### Configuration
 
-This model was made
+This model was made with two branches, diverged and recombined. The first branch was a Vimarckoso v3-based della_linear merge, and the second, a sequence of model_stock and then breadcrumbs+LoRA. The LoRAs required minor adjustments to most component models for intercompatibility. The breadcrumbs and della merges required highly focused layer-specific gradients to effectively combine the models. This was my most complex merge to date, and its final step was the SLERP-merge below.
 
 ```yaml
 name: Lamarck-14B-v0.6-rc4
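For context on what that final recombination step looks like, here is a minimal mergekit-style SLERP sketch with layer-specific gradients. The branch paths, layer count, and gradient values are illustrative assumptions only (`./della-linear-branch` and `./breadcrumbs-branch` are hypothetical names for the two branches described above); the actual Lamarck configuration begins in the truncated `yaml` block in the diff.

```yaml
# Illustrative sketch only: placeholder branch names, layer count, and gradient
# values -- not the actual Lamarck-14B-v0.6-rc4 recipe.
name: slerp-recombine-sketch
merge_method: slerp
base_model: ./della-linear-branch          # hypothetical path to the della_linear branch
slices:
  - sources:
      - model: ./della-linear-branch       # hypothetical: Vimarckoso v3-based della_linear result
        layer_range: [0, 48]               # assumes a 48-layer 14B architecture
      - model: ./breadcrumbs-branch        # hypothetical: model_stock -> breadcrumbs+LoRA result
        layer_range: [0, 48]
parameters:
  t:                                       # interpolation weight, varied across layers
    - filter: self_attn
      value: [0.0, 0.3, 0.5, 0.7, 1.0]     # attention drifts toward the breadcrumbs branch in later layers
    - filter: mlp
      value: [1.0, 0.7, 0.5, 0.3, 0.0]     # MLPs drift toward the della_linear branch in later layers
    - value: 0.5                           # everything else is an even blend
dtype: bfloat16
```

The per-filter `t` lists are the SLERP analogue of the layer-specific gradients mentioned in the commit text: instead of one global blend ratio, the interpolation weight varies with depth and differs between attention and MLP sub-blocks.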