---
base_model: []
library_name: transformers
tags:
- mergekit
- merge
---
<div style="width: auto; margin-left: auto; margin-right: auto">
<img src="https://i.imgur.com/Tn9MBg6.png" alt="MidnightMiqu" style="width: 100%; min-width: 400px; display: block; margin: auto;">
</div>
### Overview
This is a 103B frankenmerge of [sophosympatheia/Midnight-Miqu-70B-v1.5](https://huggingface.co/sophosympatheia/Midnight-Miqu-70B-v1.5) with itself. Please see that model card for details and usage instructions.
This model is based on Miqu, so it is capable of 32K context.
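If you want to run the full-precision weights with Transformers, a minimal loading sketch is shown below. The repository id is an assumption for illustration; substitute whichever repo you actually download.

```python
# Minimal sketch: loading the merged model with Hugging Face Transformers.
# The repo id below is a placeholder -- adjust it to the repository you are using.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "sophosympatheia/Midnight-Miqu-103B-v1.5"  # assumed repo id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # matches the dtype used in the merge config
    device_map="auto",          # shard across available GPUs
)

prompt = "Once upon a midnight dreary,"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```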
### Quantizations
* EXL2
* [2.5bpw-rpcal](https://huggingface.co/FluffyKaeloky/Midnight-Miqu-103B-v1.5-exl2-2.5bpw-rpcal)
* [3.0bpw-rpcal](https://huggingface.co/FluffyKaeloky/Midnight-Miqu-103B-v1.5-exl2-3.0bpw-rpcal)
* [3.5bpw-rpcal](https://huggingface.co/FluffyKaeloky/Midnight-Miqu-103B-v1.5-exl2-3.5bpw-rpcal)
* [4.0bpw-rpcal](https://huggingface.co/FluffyKaeloky/Midnight-Miqu-103B-v1.5-exl2-4.0bpw-rpcal)
* [5.0bpw-rpcal](https://huggingface.co/FluffyKaeloky/Midnight-Miqu-103B-v1.5-exl2-5.0bpw-rpcal)
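To grab one of the EXL2 quantizations listed above, a short sketch using `huggingface_hub` follows; the specific repo id is just one of the links from the list.

```python
# Minimal sketch: downloading one of the EXL2 quantizations with huggingface_hub.
from huggingface_hub import snapshot_download

local_dir = snapshot_download(
    repo_id="FluffyKaeloky/Midnight-Miqu-103B-v1.5-exl2-4.0bpw-rpcal",
)
print(f"Quantized weights downloaded to: {local_dir}")
```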
### Licence and usage restrictions
<font color="red">152334H/miqu-1-70b-sf was based on a leaked version of one of Mistral's models.</font>
All miqu-derived models, including this merge, are **only suitable for personal use.** Mistral has been cool about it so far, but you should be aware that by downloading this merge you are assuming whatever legal risk is inherent in acquiring and using a model based on leaked weights.
This merge comes with no warranties or guarantees of any kind, but you probably already knew that.
I am not a lawyer and I do not profess to know what we have gotten ourselves into here. You should consult with a lawyer before using any Hugging Face model beyond private use... but definitely don't use this one for that!
## Merge Details
### Merge Method
This model was merged using the passthrough merge method.
### Models Merged
The following models were included in the merge:
* [sophosympatheia/Midnight-Miqu-70B-v1.5](https://huggingface.co/sophosympatheia/Midnight-Miqu-70B-v1.5)
### Configuration
The following YAML configuration was used to produce this model:
```yaml
slices:
- sources:
  - model: sophosympatheia/Midnight-Miqu-70B-v1.5
    layer_range: [0, 40] # 40 layers
- sources:
  - model: sophosympatheia/Midnight-Miqu-70B-v1.5
    layer_range: [20, 60] # 40 layers
- sources:
  - model: sophosympatheia/Midnight-Miqu-70B-v1.5
    layer_range: [40, 80] # 40 layers
merge_method: passthrough
dtype: float16
```
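Stacking the three 40-layer slices yields 120 decoder layers versus the 80 layers of the 70B base, with the overlapping ranges duplicated. A rough back-of-the-envelope estimate of the resulting parameter count, assuming Llama-2-70B-style dimensions, is sketched below.

```python
# Rough estimate of the merged model's parameter count (assumed Llama-2-70B-style dims).
hidden = 8192
vocab = 32000
base_layers = 80
base_params = 69e9                 # approximate parameter count of the 70B base
embed_params = 2 * vocab * hidden  # input embeddings + LM head (~0.5B total)

per_layer = (base_params - embed_params) / base_layers  # ~0.85B per decoder layer

merged_layers = 40 + 40 + 40       # three passthrough slices of 40 layers each
merged_params = merged_layers * per_layer + embed_params

print(f"~{merged_params / 1e9:.0f}B parameters")  # roughly 103B
```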