runtime error

Exit code: 1. Reason: ()`.
  warnings.warn(
/usr/local/lib/python3.10/site-packages/pydantic/_internal/_fields.py:161: UserWarning: Field "model_kwargs" has conflict with protected namespace "model_". You may be able to resolve this warning by setting `model_config['protected_namespaces'] = ()`.
  warnings.warn(
[nltk_data] Downloading package stopwords to
[nltk_data]     /usr/local/lib/python3.10/site-
[nltk_data]     packages/llama_index/legacy/_static/nltk_cache...
[nltk_data]   Unzipping corpora/stopwords.zip.
[nltk_data] Downloading package punkt to
[nltk_data]     /usr/local/lib/python3.10/site-
[nltk_data]     packages/llama_index/legacy/_static/nltk_cache...
[nltk_data]   Unzipping tokenizers/punkt.zip.

Downloading shards:   0%|          | 0/2 [00:00<?, ?it/s]
Downloading shards:  50%|█████     | 1/2 [00:08<00:08,  8.27s/it]
Downloading shards: 100%|██████████| 2/2 [00:11<00:00,  5.21s/it]
Downloading shards: 100%|██████████| 2/2 [00:11<00:00,  5.67s/it]

Loading checkpoint shards:   0%|          | 0/2 [00:00<?, ?it/s]
Loading checkpoint shards: 100%|██████████| 2/2 [00:01<00:00,  1.81it/s]
Loading checkpoint shards: 100%|██████████| 2/2 [00:01<00:00,  1.81it/s]

Traceback (most recent call last):
  File "/home/user/app/app.py", line 111, in <module>
    llm = HuggingFaceLLM(context_window=4096,
  File "/usr/local/lib/python3.10/site-packages/llama_index/llms/huggingface/base.py", line 264, in __init__
    f"The model `{model_name}` and tokenizer `{self._tokenizer.name_or_path}` "
  File "/usr/local/lib/python3.10/site-packages/pydantic/main.py", line 807, in __getattr__
    return self.__pydantic_private__[item]  # type: ignore
  File "/usr/local/lib/python3.10/site-packages/pydantic/main.py", line 825, in __getattr__
    return super().__getattribute__(item)  # Raises AttributeError if appropriate
AttributeError: 'HuggingFaceLLM' object has no attribute '__pydantic_private__'. Did you mean: '__pydantic_complete__'?
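The crash happens inside HuggingFaceLLM.__init__: when tokenizer_name and model_name differ, the warning built at line 264 of base.py reads self._tokenizer before pydantic's private attributes exist, so the real mismatch message is masked by the AttributeError. A minimal workaround sketch follows, assuming app.py constructs the LLM roughly as the traceback shows; the checkpoint id is a placeholder and upgrading llama-index-llms-huggingface may also resolve it. Passing the same checkpoint for both model_name and tokenizer_name keeps __init__ out of the failing warning branch.

from llama_index.llms.huggingface import HuggingFaceLLM

# Placeholder checkpoint id; substitute whatever app.py actually loads.
MODEL_ID = "mistralai/Mistral-7B-Instruct-v0.2"

llm = HuggingFaceLLM(
    context_window=4096,
    max_new_tokens=256,
    model_name=MODEL_ID,
    tokenizer_name=MODEL_ID,  # matching names avoid the mismatch-warning path that crashes
    device_map="auto",
)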
