Runtime error

Exit code: 1. Reason:

/usr/local/lib/python3.10/site-packages/transformers/models/auto/tokenization_auto.py:810: FutureWarning: The `use_auth_token` argument is deprecated and will be removed in v5 of Transformers. Please use `token` instead.
  warnings.warn(
Traceback (most recent call last):
  File "/home/user/app/app.py", line 10, in <module>
    tokenizer = AutoTokenizer.from_pretrained(model_name, use_auth_token=hf_token)
  File "/usr/local/lib/python3.10/site-packages/transformers/models/auto/tokenization_auto.py", line 940, in from_pretrained
    return tokenizer_class_fast.from_pretrained(pretrained_model_name_or_path, *inputs, **kwargs)
  File "/usr/local/lib/python3.10/site-packages/transformers/tokenization_utils_base.py", line 2016, in from_pretrained
    raise EnvironmentError(
OSError: Can't load tokenizer for 'bartowski/Llama-3.2-3B-Instruct-uncensored-GGUF'. If you were trying to load it from 'https://huggingface.co/models', make sure you don't have a local directory with the same name. Otherwise, make sure 'bartowski/Llama-3.2-3B-Instruct-uncensored-GGUF' is the correct path to a directory containing all relevant files for a BartTokenizerFast tokenizer.
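Two separate issues show up in this log. First, the FutureWarning: `use_auth_token` is deprecated in Transformers and should be replaced with `token`. Second, the OSError: 'bartowski/Llama-3.2-3B-Instruct-uncensored-GGUF' is a repository of GGUF quantizations, which typically does not ship the standard tokenizer files that `AutoTokenizer` expects, so the load fails (the "BartTokenizerFast" mention is just the library's fallback guess, not the real tokenizer type). A minimal sketch of a fix is below; it assumes the tokenizer should instead be loaded from the original base-model repository (shown here as a placeholder `BASE_REPO` — substitute the actual upstream model this GGUF was quantized from), and the `auth_kwargs` helper is a hypothetical name introduced for illustration:

```python
import os

def auth_kwargs(hf_token):
    # Replaces the deprecated `use_auth_token=` pattern: Transformers v5
    # removes that argument, so forward the token via `token=` instead.
    # Returns an empty dict when no token is set (public repos need none).
    return {"token": hf_token} if hf_token else {}

def load_tokenizer(base_repo):
    # Import here so the module itself stays stdlib-only; transformers
    # must be installed where this actually runs.
    from transformers import AutoTokenizer

    hf_token = os.environ.get("HF_TOKEN")

    # Point AutoTokenizer at the original (non-GGUF) model repo, which
    # contains tokenizer.json / tokenizer_config.json. The GGUF quant
    # repo from the traceback has no such files, hence the OSError.
    return AutoTokenizer.from_pretrained(base_repo, **auth_kwargs(hf_token))

# Usage (BASE_REPO is a placeholder -- verify the real upstream repo id):
# tokenizer = load_tokenizer(BASE_REPO)
```

Alternatively, recent Transformers versions can pull tokenizer data directly out of a GGUF file via the `gguf_file=` argument to `from_pretrained`, if you prefer to stay within the quantized repo; check that your installed version supports it.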
