slices:
  - sources:
      - model: models/perky-70b-v0.1
        layer_range: [0, 30]
  - sources:
      - model: models/perky-70b-v0.1
        layer_range: [10, 70]
  - sources:
      - model: models/perky-70b-v0.1
        layer_range: [50, 80]
merge_method: passthrough
dtype: float16