TypeError: RefinedWebModel isn't supported yet.
This is my code:
```python
from transformers import AutoTokenizer
from auto_gptq import AutoGPTQForCausalLM

# Download the model from HF and store it locally, then reference its location here:
#quantized_model_dir = model_path

tokenizer = AutoTokenizer.from_pretrained("TheBloke/falcon-7b-instruct-GPTQ", use_fast=False)
model = AutoGPTQForCausalLM.from_quantized(
    "TheBloke/falcon-7b-instruct-GPTQ",
    device="cuda:1",
    use_triton=False,
    use_safetensors=True,
    trust_remote_code=True,
)
```
When I run this, I get the following:
- configuration_RW.py
Make sure to double-check they do not contain any added malicious code. To avoid downloading new versions of the code file, you can pin a revision.
```
Traceback (most recent call last)

in <cell line: 11>:11

/usr/local/lib/python3.10/dist-packages/auto_gptq/modeling/auto.py:62 in from_quantized

     59         model_basename: Optional[str] = None,
     60         trust_remote_code: bool = False
     61     ) -> BaseGPTQForCausalLM:
  ❱  62         model_type = check_and_get_model_type(save_dir)
     63         return GPTQ_CAUSAL_LM_MODEL_MAP[model_type].from_quantized(
     64             save_dir=save_dir,
     65             device=device,

/usr/local/lib/python3.10/dist-packages/auto_gptq/modeling/_utils.py:124 in check_and_get_model_type

    121 def check_and_get_model_type(model_dir):
    122     config = AutoConfig.from_pretrained(model_dir, trust_remote_code=True)
    123     if config.model_type not in SUPPORTED_MODELS:
  ❱ 124         raise TypeError(f"{config.model_type} isn't supported yet.")
    125     model_type = config.model_type
    126     return model_type

TypeError: RefinedWebModel isn't supported yet.
```
How should I get rid of this error? Can somebody help?
@TheBloke, can you please chime in? Thanks!
Same issue... please help!
You must not be using the latest version of AutoGPTQ. Please ensure you're running version 0.2.0, released yesterday.
pip install auto-gptq --upgrade --upgrade-strategy only-if-needed
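In Colab, `pip show` can disagree with what the running kernel actually imports until you restart the runtime. As a sanity check, you can query the installed version from Python itself; this is a generic stdlib sketch, not anything AutoGPTQ-specific:

```python
# Confirm which auto-gptq version the current Python environment sees,
# using only the standard library (Python 3.8+).
from importlib import metadata

def installed_version(dist_name: str):
    """Return the installed version of a distribution, or None if absent."""
    try:
        return metadata.version(dist_name)
    except metadata.PackageNotFoundError:
        return None

print(installed_version("auto-gptq"))  # e.g. '0.1.0' before the upgrade
```

If this still prints 0.1.0 after upgrading, restart the runtime so the new package is picked up.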
> You must not be using the latest version of AutoGPTQ. Please ensure you're running version 0.2.0, released yesterday.
> `pip install auto-gptq --upgrade --upgrade-strategy only-if-needed`
That didn't work. I'm still getting the same issue.
OK, I am checking it.
> You must not be using the latest version of AutoGPTQ. Please ensure you're running version 0.2.0, released yesterday.
> `pip install auto-gptq --upgrade --upgrade-strategy only-if-needed`
I upgraded the package using that command. After that I checked the version using `pip show auto-gptq`.
Output:
```
Name: auto-gptq
Version: 0.1.0
Summary:
Home-page:
Author:
Author-email:
License:
Location: /usr/local/lib/python3.10/dist-packages
Requires: accelerate, datasets, numpy, rouge, safetensors, torch, transformers
Required-by:
```
The version is still 0.1.0.
Someone please help me fix this!
I used Eric's model too... it's showing the same RefinedWebModel not supported error.
It works for me.
Please try:
pip uninstall auto-gptq
pip install -v auto-gptq==0.2.0
Copy all the output, then test again. And if it doesn't work, please paste the output here. When pasting, type ```, hit enter, paste your log, hit enter, then type ``` again. This will enclose the log in a code block so it's easier to read.
If you have version 0.1.0 then that's definitely the problem. It should be version 0.2.0
> It works for me.
> Please try:
> `pip uninstall auto-gptq`
> `pip install -v auto-gptq==0.2.0`
> Copy all the output, then test again. And if it doesn't work, please paste the output here. When pasting, type ```, hit enter, paste your log, hit enter, then type ``` again. This will enclose the log in a code block so it's easier to read.
After running this, it says `No matching distribution found for auto-gptq==0.2.0`.
Log:
```
Found existing installation: auto-gptq 0.1.0
Uninstalling auto-gptq-0.1.0:
  Successfully uninstalled auto-gptq-0.1.0
Using pip 23.1.2 from /usr/local/lib/python3.10/dist-packages/pip (python 3.10)
Looking in indexes: https://pypi.org/simple, https://us-python.pkg.dev/colab-wheels/public/simple/
Collecting auto-gptq==0.2.0
  Using cached auto_gptq-0.2.0.tar.gz (47 kB)
  Running command python setup.py egg_info
  running egg_info
  creating /tmp/pip-pip-egg-info-d0sklosj/auto_gptq.egg-info
  writing /tmp/pip-pip-egg-info-d0sklosj/auto_gptq.egg-info/PKG-INFO
  writing dependency_links to /tmp/pip-pip-egg-info-d0sklosj/auto_gptq.egg-info/dependency_links.txt
  writing requirements to /tmp/pip-pip-egg-info-d0sklosj/auto_gptq.egg-info/requires.txt
  writing top-level names to /tmp/pip-pip-egg-info-d0sklosj/auto_gptq.egg-info/top_level.txt
  writing manifest file '/tmp/pip-pip-egg-info-d0sklosj/auto_gptq.egg-info/SOURCES.txt'
  /usr/local/lib/python3.10/dist-packages/torch/utils/cpp_extension.py:476: UserWarning: Attempted to use ninja as the BuildExtension backend but we could not find ninja.. Falling back to using the slow distutils backend.
    warnings.warn(msg.format('we could not find ninja.'))
  reading manifest file '/tmp/pip-pip-egg-info-d0sklosj/auto_gptq.egg-info/SOURCES.txt'
  adding license file 'LICENSE'
  writing manifest file '/tmp/pip-pip-egg-info-d0sklosj/auto_gptq.egg-info/SOURCES.txt'
  Preparing metadata (setup.py) ... done
Discarding https://files.pythonhosted.org/packages/b1/f9/97153ae5cf926f96fd37e61424a1bb58e0c9991cc220b2e17390fb8bde97/auto_gptq-0.2.0.tar.gz (from https://pypi.org/simple/auto-gptq/) (requires-python:>=3.8.0): Requested auto-gptq==0.2.0 from https://files.pythonhosted.org/packages/b1/f9/97153ae5cf926f96fd37e61424a1bb58e0c9991cc220b2e17390fb8bde97/auto_gptq-0.2.0.tar.gz has inconsistent version: expected '0.2.0', but metadata has '0.2.0+cu1180'
ERROR: Could not find a version that satisfies the requirement auto-gptq==0.2.0 (from versions: 0.0.4, 0.0.5, 0.1.0, 0.2.0)
ERROR: No matching distribution found for auto-gptq==0.2.0
```
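For what it's worth, the `+cu1180` suffix pip complains about is a PEP 440 *local version label*: the sdist's filename advertises the public version `0.2.0`, but the metadata produced at build time reports `0.2.0+cu1180`, and pip discards the candidate over that inconsistency. A minimal sketch of the split (`split_local` is an illustrative helper, not part of pip):

```python
# PEP 440: "0.2.0+cu1180" = public version "0.2.0" + local label "cu1180".
# pip rejected the sdist because the built metadata carried the local
# label while the filename (and the ==0.2.0 request) did not.
def split_local(version: str):
    """Split a PEP 440 version into (public version, local label or None)."""
    public, _, local = version.partition("+")
    return public, (local or None)

print(split_local("0.2.0+cu1180"))  # ('0.2.0', 'cu1180')
print(split_local("0.1.0"))         # ('0.1.0', None)
```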
What OS are you using?
> What OS are you using?

Colab, sir!
> Could not find a version that satisfies the requirement auto-gptq==0.2.0 (from versions: 0.0.4, 0.0.5, 0.1.0, 0.2.0)

:|
All right I think this is a problem in the AutoGPTQ repo that will need to be fixed there. Please build it from source:
```
git clone https://github.com/PanQiWei/AutoGPTQ
cd AutoGPTQ
pip install .
```
So I should compile it from source?
> All right I think this is a problem in the AutoGPTQ repo that will need to be fixed there. Please build it from source:
> `git clone https://github.com/PanQiWei/AutoGPTQ`
> `cd AutoGPTQ`
> `pip install .`
This needs the CUDA developer toolkit, right?
The CUDA version on the Colab GPU is 12.0, and AutoGPTQ ships precompiled libs only for CUDA 11.7 and 11.8; for 12.0 it has to be compiled from source. This might be a problem.
OK. Now I have auto-gptq version 0.2.0+cu1180:
```
Version: 0.2.0+cu1180
Summary: An easy-to-use LLMs quantization package with user-friendly apis, based on GPTQ algorithm.
Home-page: https://github.com/PanQiWei/AutoGPTQ
Author: PanQiWei
Author-email:
License:
Location: /usr/local/lib/python3.10/dist-packages
Requires: accelerate, datasets, numpy, rouge, safetensors, torch, transformers
Required-by:
```
But the Colab CUDA version is 12.0. I don't know whether this will work or not.
Let me try it and report back.
You don't have CUDA toolkit 12.0 installed, you have 11.8. And compiling from source will work fine.
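The toolkit version (what `nvcc --version` reports) is what matters when compiling, not the driver version `nvidia-smi` shows. A small sketch for pulling the release number out of that output; `toolkit_version` is a hypothetical helper and the sample string is illustrative:

```python
# Extract the CUDA *toolkit* release (e.g. "11.8") from `nvcc --version`
# output. The sample string mimics nvcc's usual last line.
import re

def toolkit_version(nvcc_output: str):
    """Return 'X.Y' from nvcc output, or None if no release line is found."""
    match = re.search(r"release (\d+\.\d+)", nvcc_output)
    return match.group(1) if match else None

sample = "Cuda compilation tools, release 11.8, V11.8.89"
print(toolkit_version(sample))  # 11.8
# On a live machine:
#   import subprocess
#   out = subprocess.run(["nvcc", "--version"], capture_output=True, text=True).stdout
#   print(toolkit_version(out))
```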
PS. I have raised this problem as an Issue on the AutoGPTQ repo: https://github.com/PanQiWei/AutoGPTQ/issues/124
Please track it for future reference.
I am facing the below error while using ToRA. Any support or guidance would be great:
ImportError: Found an incompatible version of auto-gptq. Found version 0.4.2+cu1180, but only version above 0.4.99 are supported
> I am facing the below error while using ToRA. Any support or guidance would be great:
> ImportError: Found an incompatible version of auto-gptq. Found version 0.4.2+cu1180, but only version above 0.4.99 are supported
Same here:
ImportError: Found an incompatible version of auto-gptq. Found version 0.4.2+cu121, but only version above 0.4.99 are supported
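That check fails simply because the release digits 0.4.2 sort below 0.4.99, regardless of the `+cu1180`/`+cu121` label, so upgrading auto-gptq past 0.4.99 should clear it. A simplified comparison sketch (no pre-release handling, `release_tuple` is a hypothetical helper):

```python
# Why 0.4.2+cu1180 is rejected by a ">= 0.4.99" requirement: only the
# public release digits are compared, and (0, 4, 2) < (0, 4, 99).
def release_tuple(version: str):
    """Numeric release part of a version, ignoring any '+local' label."""
    public = version.partition("+")[0]
    return tuple(int(part) for part in public.split("."))

print(release_tuple("0.4.2+cu1180") < release_tuple("0.4.99"))  # True
print(release_tuple("0.4.2+cu121") < release_tuple("0.4.99"))   # True
```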