2.65bpw-h6-exl2 quant of DavidAU's L3.1-RP-Hero-Dirty_Harry-8B

Link to the original model and creator: https://huggingface.co/DavidAU/L3.1-RP-Hero-Dirty_Harry-8B-GGUF
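
Below is a minimal loading sketch using the ExLlamaV2 Python library, which is the usual backend for exl2 quants. The local model directory path, context length, and prompt are placeholders, and the exact API may vary between exllamav2 versions.

```python
# Sketch: load an exl2-quantized model with exllamav2 (API may differ by version)
from exllamav2 import ExLlamaV2, ExLlamaV2Config, ExLlamaV2Cache, ExLlamaV2Tokenizer
from exllamav2.generator import ExLlamaV2DynamicGenerator

# Placeholder path to the downloaded 2.65bpw-h6-exl2 quant
model_dir = "./L3.1-RP-Hero-Dirty_Harry-8B_2.65bpw-h6-exl2"

config = ExLlamaV2Config(model_dir)
model = ExLlamaV2(config)
cache = ExLlamaV2Cache(model, lazy=True)   # lazy allocation, filled during autosplit load
model.load_autosplit(cache)
tokenizer = ExLlamaV2Tokenizer(config)

generator = ExLlamaV2DynamicGenerator(model=model, cache=cache, tokenizer=tokenizer)
output = generator.generate(prompt="Hello, my name is", max_new_tokens=64)
print(output)
```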
