<!-- ### quantize_version: 2 -->
<!-- ### output_tensor_quantised: 1 -->
<!-- ### convert_type: hf -->
<!-- ### vocab_type: -->
static quants of https://huggingface.co/frost19k/dolphin-2.8-mistral-v02-2x7b
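As a usage sketch (not part of the original card): one common way to run a static GGUF quant of a model like this is to download the quantized file with `huggingface_hub` and load it with `llama-cpp-python`. The repo id and filename below are hypothetical placeholders; substitute the actual quant repo and `.gguf` file you want to use.

```python
from huggingface_hub import hf_hub_download
from llama_cpp import Llama

# Hypothetical names -- check the quant repo's file listing for the real ones.
REPO_ID = "your-namespace/dolphin-2.8-mistral-v02-2x7b-GGUF"
FILENAME = "dolphin-2.8-mistral-v02-2x7b.Q4_K_M.gguf"

# Download the quantized model file from the Hub (cached locally on reuse).
model_path = hf_hub_download(repo_id=REPO_ID, filename=FILENAME)

# Load the GGUF quant and run a short completion.
llm = Llama(model_path=model_path, n_ctx=4096)
out = llm("Write a haiku about quantization.", max_tokens=64)
print(out["choices"][0]["text"])
```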