FileNotFoundError: File not found: DeepSeek-R1-Distill-Qwen-1.5B/tokenizer.model

#5 opened by JLouisBiz

I was trying to convert it to GGUF with llama.cpp's convert_hf_to_gguf.py so I could quantize it, and hit this:

INFO:hf-to-gguf:gguf: rms norm epsilon = 1e-06
INFO:hf-to-gguf:gguf: file type = 1
INFO:hf-to-gguf:Set model tokenizer
INFO:numexpr.utils:NumExpr defaulting to 4 threads.
WARNING:hf-to-gguf:

WARNING:hf-to-gguf:**************************************************************************************
WARNING:hf-to-gguf:** WARNING: The BPE pre-tokenizer was not recognized!
WARNING:hf-to-gguf:**          There are 2 possible reasons for this:
WARNING:hf-to-gguf:**          - the model has not been added to convert_hf_to_gguf_update.py yet
WARNING:hf-to-gguf:**          - the pre-tokenization config has changed upstream
WARNING:hf-to-gguf:**          Check your model files and convert_hf_to_gguf_update.py and update them accordingly.
WARNING:hf-to-gguf:** ref:     https://github.com/ggerganov/llama.cpp/pull/6920
WARNING:hf-to-gguf:**
WARNING:hf-to-gguf:** chkhsh:  b3f499bb4255f8ca19fccd664443283318f2fd2414d5e0b040fbdd0cc195d6c5
WARNING:hf-to-gguf:**************************************************************************************
WARNING:hf-to-gguf:

Traceback (most recent call last):
  File "/home/data1/protected/Programming/llamafile/DeepSeek/../../git/llama.cpp/convert_hf_to_gguf.py", line 2192, in set_vocab
    self._set_vocab_sentencepiece()
  File "/home/data1/protected/Programming/llamafile/DeepSeek/../../git/llama.cpp/convert_hf_to_gguf.py", line 783, in _set_vocab_sentencepiece
    tokens, scores, toktypes = self._create_vocab_sentencepiece()
                               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/data1/protected/Programming/llamafile/DeepSeek/../../git/llama.cpp/convert_hf_to_gguf.py", line 800, in _create_vocab_sentencepiece
    raise FileNotFoundError(f"File not found: {tokenizer_path}")
FileNotFoundError: File not found: DeepSeek-R1-Distill-Qwen-1.5B/tokenizer.model
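
The two messages explain each other: the converter did not recognize the model's BPE pre-tokenizer hash (the chkhsh in the warning), so set_vocab fell back to the SentencePiece loader, which requires a tokenizer.model file. The Qwen-based DeepSeek-R1 distills ship only a Hugging Face tokenizer.json, hence the FileNotFoundError. A quick way to confirm which tokenizer files the checkpoint actually contains (a minimal sketch; it assumes the model directory sits in the current working directory):

import os

model_dir = "DeepSeek-R1-Distill-Qwen-1.5B"  # assumed local copy of the model
for name in ("tokenizer.model", "tokenizer.json", "tokenizer_config.json"):
    present = os.path.isfile(os.path.join(model_dir, name))
    print(f"{name}: {'present' if present else 'missing'}")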

Resolved it by updating to a newer version of llama.cpp. The updated convert_hf_to_gguf.py recognizes this model's BPE pre-tokenizer, so it no longer falls back to the SentencePiece path that expects a tokenizer.model file.
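
For anyone curious why updating helps: convert_hf_to_gguf.py identifies a pre-tokenizer by encoding a fixed probe text and hashing the resulting token IDs, which produces the chkhsh printed in the warning; newer releases know the hash for this model family. A minimal sketch of that fingerprinting idea, assuming the transformers library and a local copy of the model (the probe string below is illustrative, not the script's real test text):

from hashlib import sha256
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("DeepSeek-R1-Distill-Qwen-1.5B")
# Encode a fixed probe text and hash the token IDs; the converter compares
# this digest against its table of known pre-tokenizer hashes.
probe = "Hello world à 123 😀"  # illustrative; the real script uses a longer fixed text
chkhsh = sha256(str(tokenizer.encode(probe)).encode()).hexdigest()
print(chkhsh)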

JLouisBiz changed discussion status to closed
