Model Access not working

#60
by rujhanjain07 - opened

I am an AI/ML engineer/researcher and regularly need Llama models for research and development. I filled out the access agreements for the Llama models I needed, such as Llama-2-7b-chat, and was granted access. For several months I was able to download the model using the code below:
from huggingface_hub import login, hf_hub_download

def download_llama2():
    # Authenticate using your Hugging Face API token
    login(token="[My_HF_Token]", add_to_git_credential=True)
    repo_id = 'meta-llama/Llama-2-7b-chat-hf'
    local_dir = "/content/llama_metadata"
    filenames = [
        "model-00001-of-00002.safetensors",
        "model-00002-of-00002.safetensors",
        "model.safetensors.index.json",
        "config.json",
        "generation_config.json",
        "tokenizer.json",
        "tokenizer.model",
        "special_tokens_map.json",
        "tokenizer_config.json",
    ]
    # Download each file from the gated repo into the local directory
    for filename in filenames:
        hf_hub_download(repo_id=repo_id, filename=filename, local_dir=local_dir)
I had also been able to load the models directly using the transformers library's pipeline, as well as the standard model and tokenizer classes in transformers.
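For reference, the direct loading I mean is along these lines (a rough sketch, not my exact script; the model ID and generation arguments vary, and the token placeholder is not my real token):

    from transformers import pipeline

    # Load the gated model directly through a text-generation pipeline,
    # authenticating with the same Hugging Face token used for downloads.
    pipe = pipeline(
        "text-generation",
        model="meta-llama/Llama-2-7b-chat-hf",
        token="[My_HF_Token]",
    )
    print(pipe("Hello, how are you?", max_new_tokens=20)[0]["generated_text"])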
But recently, whenever I try to either download the model or load it directly, I get the error below:

HTTPError: 401 Client Error: Unauthorized for url: https://huggingface.co/meta-llama/Llama-3.2-11B-Vision/resolve/main/config.json

The above exception was the direct cause of the following exception:

GatedRepoError Traceback (most recent call last)
GatedRepoError: 401 Client Error. (Request ID: Root=1-677e2bf8-0d1c98863fa7a895331fff6f;56da8764-3fe5-45a8-8ca2-744e24c09693)

Cannot access gated repo for url https://huggingface.co/meta-llama/Llama-3.2-11B-Vision/resolve/main/config.json.
Access to model meta-llama/Llama-3.2-11B-Vision is restricted. You must have access to it and be authenticated to access it. Please log in.

The above exception was the direct cause of the following exception:

OSError Traceback (most recent call last)
/usr/local/lib/python3.10/dist-packages/transformers/utils/hub.py in cached_file(path_or_repo_id, filename, cache_dir, force_download, resume_download, proxies, token, revision, local_files_only, subfolder, repo_type, user_agent, _raise_exceptions_for_gated_repo, _raise_exceptions_for_missing_entries, _raise_exceptions_for_connection_errors, _commit_hash, **deprecated_kwargs)
419 if resolved_file is not None or not _raise_exceptions_for_gated_repo:
420 return resolved_file
--> 421 raise EnvironmentError(
422 "You are trying to access a gated repo.\nMake sure to have access to it at "
423 f"https://huggingface.co/{path_or_repo_id}.\n{str(e)}"

OSError: You are trying to access a gated repo.
Make sure to have access to it at https://huggingface.co/meta-llama/Llama-3.2-11B-Vision.
401 Client Error. (Request ID: Root=1-677e2bf8-0d1c98863fa7a895331fff6f;56da8764-3fe5-45a8-8ca2-744e24c09693)

Cannot access gated repo for url https://huggingface.co/meta-llama/Llama-3.2-11B-Vision/resolve/main/config.json.
Access to model meta-llama/Llama-3.2-11B-Vision is restricted. You must have access to it and be authenticated to access it. Please log in.

I do have access to meta-llama/Llama-3.2-11B-Vision as well; the repository page shows that access has been granted, yet in code I always get this error. Does anyone know how to resolve this? Please let me know if there is a solution as soon as possible.
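If it helps with debugging, here is a minimal sketch of how the token can be checked against the account that was granted access, using huggingface_hub's login and whoami (the token placeholder is not my real token):

    from huggingface_hub import login, whoami

    # Re-authenticate and confirm which account the token resolves to;
    # the gated access grant must belong to this same account.
    login(token="[My_HF_Token]")
    print(whoami())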

Thank you.
