TypeError: internlm isn't supported yet
#1 opened by yuxuan2022
When execution reaches this line:
model = AutoGPTQForCausalLM.from_quantized("cczhong/internlm-chat-7b-4bit-gptq", trust_remote_code=True)
it fails with: TypeError: internlm isn't supported yet
auto-gptq was installed from GitHub with: pip install git+https://github.com/PanQiWei/AutoGPTQ
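
For context, this error is raised because AutoGPTQ's `from_quantized` only accepts model types it recognizes, so the installed build evidently does not list `internlm` yet. A minimal sketch to check what the installed package actually supports, assuming it exposes a `SUPPORTED_MODELS` list under `auto_gptq.modeling._const` (the exact module path may differ between auto-gptq versions):

```python
# Minimal check of the locally installed auto-gptq build.
# Assumption: SUPPORTED_MODELS lives in auto_gptq.modeling._const,
# which is where recent AutoGPTQ source trees define it.
import auto_gptq
from auto_gptq.modeling._const import SUPPORTED_MODELS

print("auto-gptq version:", auto_gptq.__version__)
print("supported model types:", SUPPORTED_MODELS)
print("internlm supported:", "internlm" in SUPPORTED_MODELS)
```

If `internlm` is missing from that list, the GitHub install may have pulled a commit that predates InternLM support; printing the version string helps confirm which build is actually in the environment.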