ValueError: [quantize] The requested number of bits 3 is not supported. The supported bits are 2, 4 and 8.

#2 opened by sauterne

It seems there is no support for 3-bit quantization? How can I use this model?

MLX Community org

Update your mlx version to v0.21.
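
For anyone hitting the same error, here is a minimal sketch of loading a 3-bit model after upgrading. The install command and the repo id below are assumptions for illustration, not from this thread; substitute the actual mlx-community 3-bit model this discussion is attached to.

```python
# Assumes mlx >= 0.21 and mlx-lm are installed, e.g.:
#   pip install -U "mlx>=0.21" mlx-lm
from mlx_lm import load, generate

# Placeholder repo id -- replace with the 3-bit mlx-community model you want to run.
model, tokenizer = load("mlx-community/<model-name>-3bit")

print(generate(model, tokenizer, prompt="Hello", max_tokens=64))
```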
