# Exllama v2 Quantizations of StructLM-7B
Using turboderp's ExLlamaV2 v0.0.14 for quantization.
The "main" branch only contains the measurement.json, download one of the other branches for the model (see below)
Each branch contains an individual bits per weight, with the main one containing only the meaurement.json for further conversions.
Original model: https://huggingface.co/TIGER-Lab/StructLM-7B
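For reference, branches like these are produced with ExLlamaV2's `convert.py`. A rough sketch, reusing the measurement.json from the `main` branch so the slow measurement pass does not have to be repeated (paths here are hypothetical, and flags may differ slightly between ExLlamaV2 versions):

```shell
# Quantize the original FP16 model to 6.5 bpw with an 8-bit lm_head,
# reusing an existing measurement.json to skip the measurement pass.
# -i: input model dir, -o: working dir, -cf: compiled output dir,
# -m: existing measurement file, -b: bits per weight, -hb: lm_head bits
python convert.py \
  -i ./StructLM-7B \
  -o ./work \
  -cf ./StructLM-7B-exl2-6_5 \
  -m ./StructLM-7B-exl2/measurement.json \
  -b 6.5 \
  -hb 8
```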
No GQA - VRAM requirements will be higher than for GQA models, since the full-size KV cache for a 7B Llama-style model (32 layers, 4096 hidden dim) costs roughly 0.5 MB per token in FP16: about 2 GB at 4k context and 8 GB at 16k, which is why the 16k sizes below are so much larger.
| Branch | Bits | lm_head bits | Size (4k context) | Size (16k context) | Description |
| ------ | ---- | ------------ | ----------------- | ------------------ | ----------- |
| 8_0 | 8.0 | 8.0 | 9.0 GB | 15.2 GB | Maximum quality that ExLlamaV2 can produce, near unquantized performance. |
| 6_5 | 6.5 | 8.0 | 8.2 GB | 14.4 GB | Near unquantized performance at vastly reduced size, recommended. |
| 5_0 | 5.0 | 6.0 | 6.8 GB | 13.0 GB | Slightly lower quality vs 6.5, but usable on 8GB cards with 4k context. |
| 4_25 | 4.25 | 6.0 | 6.1 GB | 12.3 GB | GPTQ equivalent bits per weight. |
| 3_5 | 3.5 | 6.0 | 5.5 GB | 11.7 GB | Lower quality, not recommended. |
## Download instructions
With git:

```shell
git clone --single-branch --branch 6_5 https://huggingface.co/bartowski/StructLM-7B-exl2
```
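Note that model weights on Hugging Face are stored with Git LFS, so make sure it is installed and initialized first, or the clone will contain only pointer files:

```shell
# Set up Git LFS once per machine before cloning model repos
git lfs install
```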
With huggingface hub (credit to TheBloke for instructions):

```shell
pip3 install huggingface-hub
```
To download the `main` branch (only useful if you only care about the measurement.json) to a folder called `StructLM-7B-exl2`:
```shell
mkdir StructLM-7B-exl2
huggingface-cli download bartowski/StructLM-7B-exl2 --local-dir StructLM-7B-exl2 --local-dir-use-symlinks False
```
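Downloads can optionally be sped up with huggingface_hub's `hf_transfer` backend; a sketch, assuming a reasonably recent `huggingface_hub` (the extra and the environment variable below belong to `huggingface_hub` itself, not to this repo):

```shell
# Install the optional Rust-based transfer backend
pip3 install huggingface-hub[hf_transfer]

# Enable it for this download via an environment variable
HF_HUB_ENABLE_HF_TRANSFER=1 huggingface-cli download bartowski/StructLM-7B-exl2 --local-dir StructLM-7B-exl2 --local-dir-use-symlinks False
```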
To download from a different branch, add the `--revision` parameter:
Linux:

```shell
mkdir StructLM-7B-exl2-6_5
huggingface-cli download bartowski/StructLM-7B-exl2 --revision 6_5 --local-dir StructLM-7B-exl2-6_5 --local-dir-use-symlinks False
```
Windows (which sometimes has trouble with `_` in folder names):

```shell
mkdir StructLM-7B-exl2-6.5
huggingface-cli download bartowski/StructLM-7B-exl2 --revision 6_5 --local-dir StructLM-7B-exl2-6.5 --local-dir-use-symlinks False
```
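Once downloaded, the model can be loaded by any ExLlamaV2-compatible frontend. As a quick smoke test, a sketch using the `test_inference.py` script from the ExLlamaV2 repo (assuming you have a checkout of that repo and downloaded the 6_5 branch as above; the prompt is just an example):

```shell
# -m: path to the quantized model directory, -p: prompt to generate from
python test_inference.py -m ./StructLM-7B-exl2-6_5 -p "Convert this table to JSON:"
```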