ValueError: Cannot instantiate this tokenizer from a slow version. If it's based on sentencepiece, make sure you have sentencepiece installed.

#62
by Berserq - opened

Has anyone overcome this error?

You should run: `pip install sentencepiece`

Yes, thank you, that's right. I did that and it worked for me; I forgot to update my post.
