ValueError: [quantize] The requested number of bits 3 is not supported. The supported bits are 2, 4 and 8.
#2 opened by sauterne
It seems 3-bit quantization is not supported? How can I use this model?
Update your mlx version to v0.21, which the error above predates.
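A minimal upgrade sketch (assuming you installed mlx via pip; the `mlx-lm` package name is an assumption based on the usual MLX model tooling):

```shell
# Upgrade mlx to at least v0.21, where 3-bit support is expected per the reply above
pip install --upgrade "mlx>=0.21" mlx-lm

# Confirm the installed version before retrying the quantized model
python -c "import mlx.core; print(mlx.core.__version__)"
```

If the error persists after upgrading, check that the interpreter you run the model with is the same one pip upgraded.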