ValueError

ValueError: rope_scaling must be a dictionary with two fields, type and factor, got {'factor': 8.0, 'low_freq_factor': 1.0, 'high_freq_factor': 4.0, 'original_max_position_embeddings': 8192, 'rope_type': 'llama3'}
after changing to"rope_scaling": { "factor": 8.0, "type": "dynamic" },
in config.json
it procedes further
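For reference, that edit can also be scripted. This is only a minimal sketch of the workaround described above, assuming a locally downloaded copy of the model; the path below is hypothetical, and note the reply further down explains that upgrading transformers is the proper fix.

```python
# Sketch of the config.json workaround (path is hypothetical).
import json

cfg_path = "path/to/local/model/config.json"

with open(cfg_path) as f:
    cfg = json.load(f)

# Swap the Llama 3.1 rope_scaling dict for the older two-field format
# that pre-4.43.0 transformers versions accept.
cfg["rope_scaling"] = {"factor": 8.0, "type": "dynamic"}

with open(cfg_path, "w") as f:
    json.dump(cfg, f, indent=2)
```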
then...
Lib\site-packages\transformers\integrations\awq.py", line 354, in _fuse_awq_mlp
    new_module = target_cls(gate_proj, down_proj, up_proj, activation_fn)
  File "d:\code\autoawq\awq\modules\fused\mlp.py", line 41, in __init__
    self.linear = awq_ext.gemm_forward_cuda
NameError: name 'awq_ext' is not defined
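That second traceback means the AWQ CUDA kernel extension (awq_ext) could not be imported when AutoAWQ built the fused MLP module. A minimal check, assuming awq_ext is supposed to come from AutoAWQ's optional kernels package (typically installed as autoawq-kernels):

```python
# Quick check that the optional AWQ CUDA kernel extension is importable.
# Assumption: awq_ext is provided by AutoAWQ's separate kernels package.
try:
    import awq_ext  # noqa: F401
    print("awq_ext is available; fused AWQ modules should work")
except ImportError as exc:
    print(f"awq_ext is missing ({exc}); install the AWQ kernels or disable fusion")
```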
;D
Hi there @onur48 and @froilo, indeed you should update the transformers version that you're using to fix the issue, as transformers 4.43.0 implements the RoPE scaling required for the Llama 3.1 architecture, so the following should work:
pip install --upgrade transformers
See the release notes at https://github.com/huggingface/transformers/releases/tag/v4.43.0, and note that the --upgrade flag will install https://github.com/huggingface/transformers/releases/tag/v4.43.3 instead, which also comes with follow-up fixes! 🤗
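As a quick sanity check after upgrading, the original llama3-style rope_scaling dict should load without any edits to config.json. A sketch, where the model id is an assumption; substitute the checkpoint you are actually loading (and note that the meta-llama repos are gated, so you need to have accepted the license and be logged in):

```python
# Minimal sanity check after `pip install --upgrade transformers`.
# The model id below is an assumption; use the checkpoint you are loading.
import transformers
from transformers import AutoConfig

print(transformers.__version__)  # expect 4.43.0 or newer

config = AutoConfig.from_pretrained("meta-llama/Meta-Llama-3.1-8B-Instruct")
print(config.rope_scaling)  # the llama3 rope_scaling dict should be accepted unchanged
```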