smaug-bpe
#1
by xxx31dingdong - opened
Why is this model using smaug-bpe instead of llama-bpe? Is this intentional?
It is an issue with llama.cpp; you can easily fix it with one of the built-in scripts that edit the pre-tokenizer metadata. Or you could just download my quants of this model from mradermacher, which work out of the box.
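For reference, the pre-tokenizer name lives in the `tokenizer.ggml.pre` metadata key, and llama.cpp's gguf-py package ships metadata-editing scripts for exactly this kind of fix. As a rough illustration only (not the official script), here is a minimal Python sketch using the gguf package's `GGUFReader`; the model path is a placeholder, and the in-place patch is only possible because "smaug-bpe" and "llama-bpe" happen to be the same byte length:

```python
# Hedged sketch: patch tokenizer.ggml.pre in a GGUF file in place.
# Assumes the gguf Python package (from llama.cpp's gguf-py) is installed
# and that the new name has the same byte length as the old one.
import numpy as np
from gguf import GGUFReader

MODEL_PATH = "model.gguf"   # placeholder path to your quant
NEW_PRE = b"llama-bpe"

reader = GGUFReader(MODEL_PATH, "r+")            # memory-mapped, writable
field = reader.get_field("tokenizer.ggml.pre")   # pre-tokenizer metadata key
assert field is not None, "no tokenizer.ggml.pre key in this file"

old = bytes(field.parts[-1]).decode("utf-8")     # last part holds the string bytes
print(f"current pre-tokenizer: {old}")

if old.encode("utf-8") != NEW_PRE:
    # In-place patch only works when the byte lengths match.
    assert len(old.encode("utf-8")) == len(NEW_PRE), "lengths differ; rewrite the file instead"
    field.parts[-1][:] = np.frombuffer(NEW_PRE, dtype=np.uint8)
    print("patched pre-tokenizer to llama-bpe")
```

If the replacement name were a different length, an in-place patch would not be possible and the file's metadata section would have to be rewritten instead.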