GPTQ 4-bit quantization of https://huggingface.co/TehVenom/Dolly_Shygmalion-6b
Quantized using this repository: https://github.com/mayaeary/GPTQ-for-LLaMa/tree/gptj-v2
Command used:
python3 gptj.py models/Dolly_Shygmalion-6b c4 --wbits 4 --groupsize 128 --save_safetensors models/Dolly_Shygmalion-6b-4bit-128g.safetensors
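
For reference, here is a minimal sketch (not part of the original workflow) that inspects the resulting `.safetensors` checkpoint with the `safetensors` library. It only lists the stored tensors; actual inference requires the matching quantized-model loading code from the GPTQ-for-LLaMa fork linked above.

```python
# Minimal sketch: inspect the quantized checkpoint produced by the command above.
# Assumes the file path matches the --save_safetensors argument used during quantization.
from safetensors import safe_open

path = "models/Dolly_Shygmalion-6b-4bit-128g.safetensors"
with safe_open(path, framework="pt", device="cpu") as f:
    for name in f.keys():
        tensor = f.get_tensor(name)
        # GPTQ checkpoints typically store packed int32 qweight/qzeros tensors
        # plus floating-point scales for each quantization group.
        print(name, tuple(tensor.shape), tensor.dtype)
```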