Llama-2 models don't work since they require an auth token. I have an auth token, but it is not displaying

#16
by sayambhu - opened

As written, try to find a way to include gated models

I'm having the same issue. Any updates?

accelerate org

Can you be a bit more precise about what the issue is in your case? You should be able to pass your token in the API Token field.

So in my case, I'm entering `meta-llama/Llama-2-7b-hf` as the model and my read token (hf_xxxx) as the token.

When I visit that model page, I see, as expected: "Gated model — You have been granted access to this model."

However, when I run the calculator I get the error:

Error
"Model `meta-llama/Llama-2-7b-hf` had an error, please open a discussion on the model's page with the error message and name: `You are trying to access a gated repo.\nMake sure to request access at https://huggingface.co/meta-llama/Llama-2-7b-hf and pass a token having permission to this repo either by logging in with `huggingface-cli login` or by passing `token=<your_token>`.`"
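The error suggests the token entered in the form never reaches the Hub request. As a minimal sketch of what forwarding it should look like, the snippet below builds the authorization header the Hub expects from either an explicit token or the `HF_TOKEN` environment variable; `build_auth_header` is a hypothetical helper for illustration, not part of the calculator's actual code.

```python
import os
from typing import Optional

def build_auth_header(token: Optional[str] = None) -> dict:
    """Build the Bearer auth header needed to download gated repos.

    Falls back to the HF_TOKEN environment variable when no token
    is passed explicitly (hypothetical helper for illustration).
    """
    token = token or os.environ.get("HF_TOKEN")
    if not token:
        raise ValueError(
            "A read token is required for gated repos such as "
            "meta-llama/Llama-2-7b-hf."
        )
    return {"authorization": f"Bearer {token}"}
```

If the calculator drops the token somewhere along this path, the Hub sees an anonymous request and returns exactly the "gated repo" error quoted above, even though the account itself has been granted access.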

Thanks for the traceback! I'm able to reproduce the error on my side too. We will reboot the demo and see if that solves it!

FWIW still seeing it...

Any updates?

accelerate org

Hi all, this will be solved thanks to this PR in Accelerate: https://github.com/huggingface/accelerate/pull/2327

Once it's merged, I'll factory reset the Space.

muellerzr changed discussion status to closed
