File size: 229 Bytes
base_model: sometimesanotion/Lamarck-14B-v0.7+sometimesanotion/LoRA-256-Base-Qwenvergence
dtype: float16
merge_method: passthrough
models:
  - model: sometimesanotion/Lamarck-14B-v0.7+sometimesanotion/LoRA-256-Base-Qwenvergence
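The config describes a `passthrough` merge whose single entry is Lamarck-14B-v0.7 with the LoRA-256-Base-Qwenvergence adapter applied on top, using mergekit's `model+lora` syntax. The sketch below shows how a config like this could be run with mergekit's Python API (`MergeConfiguration`, `MergeOptions`, `run_merge`); the config path and output directory are placeholders, not values from this repository, and the equivalent CLI call would be `mergekit-yaml mergekit_config.yml <output-dir>`. Treat it as an illustration under those assumptions rather than the exact invocation used here.

```python
# Minimal sketch: applying the mergekit config above programmatically.
# Assumes mergekit's Python API (MergeConfiguration, MergeOptions, run_merge);
# paths are hypothetical placeholders.
import yaml
import torch

from mergekit.config import MergeConfiguration
from mergekit.merge import MergeOptions, run_merge

# Load the small YAML file shown above (path assumed).
with open("mergekit_config.yml", "r", encoding="utf-8") as fp:
    merge_config = MergeConfiguration.model_validate(yaml.safe_load(fp))

# Passthrough merge: the "+" in the model name tells mergekit to apply the
# LoRA-256-Base-Qwenvergence adapter to Lamarck-14B-v0.7 before merging.
run_merge(
    merge_config,
    "./Lamarck-14B-merged",          # hypothetical output directory
    options=MergeOptions(
        cuda=torch.cuda.is_available(),
        copy_tokenizer=True,
        lora_merge_cache="/tmp",     # where the materialized LoRA merge is cached
    ),
)
```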