EXL2 quantization of l2-13b-thespurral-v1.

GGUF version available here.

Model details

Quantized at 5.33 bpw and 6.13 bpw.

Visit the model repo for more details.
