runtime error

Found existing installation: llama_cpp_python 0.2.14
Uninstalling llama_cpp_python-0.2.14:
  Successfully uninstalled llama_cpp_python-0.2.14
Successfully installed diskcache-5.6.3 llama-cpp-python-0.2.20 numpy-1.26.2 typing-extensions-4.8.0

[notice] A new release of pip available: 22.3.1 -> 23.3.1
[notice] To update, run: python -m pip install --upgrade pip

--2023-12-09 05:53:28--  https://huggingface.co/TheBloke/Leo-Mistral-Hessianai-7B-Chat-GGUF/blob/main/leo-mistral-hessianai-7b-chat.Q4_K_M.gguf
Resolving huggingface.co (huggingface.co)... 10.0.117.141, 10.0.151.27, 10.0.189.100, ...
Connecting to huggingface.co (huggingface.co)|10.0.117.141|:443... connected.
HTTP request sent, awaiting response... 200 OK
Length: 68953 (67K) [text/html]
Saving to: ‘leo-mistral-hessianai-7b-chat.Q4_K_M.gguf’

     0K .......... .......... .......... .......... ..........  74%  322M 0s
    50K .......... .......                                     100%  407M=0s

2023-12-09 05:53:28 (341 MB/s) - ‘leo-mistral-hessianai-7b-chat.Q4_K_M.gguf’ saved [68953/68953]

gguf_init_from_file: invalid magic characters <!DO.
error loading model: llama_model_loader: failed to load model from leo-mistral-hessianai-7b-chat.Q4_K_M.gguf
llama_load_model_from_file: failed to load model
AVX = 1 | AVX2 = 1 | AVX512 = 1 | AVX512_VBMI = 1 | AVX512_VNNI = 1 | FMA = 1 | NEON = 0 | ARM_FMA = 0 | F16C = 1 | FP16_VA = 0 | WASM_SIMD = 0 | BLAS = 0 | SSE3 = 1 | SSSE3 = 1 | VSX = 0 |
Traceback (most recent call last):
  File "/home/user/app/app.py", line 16, in <module>
    llm = Llama(model_path=model_file, model_type="mistral")
  File "/home/user/.local/lib/python3.10/site-packages/llama_cpp/llama.py", line 951, in __init__
    self._n_vocab = self.n_vocab()
  File "/home/user/.local/lib/python3.10/site-packages/llama_cpp/llama.py", line 2258, in n_vocab
    return self._model.n_vocab()
  File "/home/user/.local/lib/python3.10/site-packages/llama_cpp/llama.py", line 250, in n_vocab
    assert self.model is not None
AssertionError
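The log already contains the diagnosis: the download was served as `text/html` and only 67 KB, and `gguf_init_from_file: invalid magic characters <!DO.` shows the saved file starts with `<!DOCTYPE html>` instead of the 4-byte GGUF magic `GGUF`. That happens because the Hugging Face `/blob/main/` URL returns the file's web page; the raw file is served from `/resolve/main/`. A minimal sketch of the fix and a sanity check (the helpers `to_resolve_url` and `looks_like_gguf` are illustrative names, not part of any library):

```python
def to_resolve_url(url: str) -> str:
    """Rewrite a Hugging Face /blob/ page URL to the raw-file /resolve/ URL."""
    return url.replace("/blob/", "/resolve/", 1)

def looks_like_gguf(path: str) -> bool:
    """Check the 4-byte GGUF magic before handing the file to llama.cpp."""
    with open(path, "rb") as f:
        return f.read(4) == b"GGUF"

url = ("https://huggingface.co/TheBloke/Leo-Mistral-Hessianai-7B-Chat-GGUF"
       "/blob/main/leo-mistral-hessianai-7b-chat.Q4_K_M.gguf")
print(to_resolve_url(url))
# → https://huggingface.co/TheBloke/Leo-Mistral-Hessianai-7B-Chat-GGUF/resolve/main/leo-mistral-hessianai-7b-chat.Q4_K_M.gguf
```

With the corrected URL the download should be several gigabytes, not 67 KB. A file that fails the magic check produces exactly the traceback above: the C-level loader rejects the file, so `Llama.__init__` reaches `assert self.model is not None` with a null model. Separately, `model_type="mistral"` is not a documented `Llama` parameter (it comes from the `ctransformers` API); judging by the traceback it is silently ignored here, but it can be dropped.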
