Missing tokens.

#2
opened by AUTOMATIC

Nous has <|im_start|> and <|im_end|> as special tokens, as seen here: https://huggingface.co/NousResearch/Nous-Hermes-2-Mixtral-8x7B-DPO/blob/main/tokenizer_config.json

This merge does not have those tokens. If I add them to the config myself, tokenization works, but inference fails with an out-of-range error in the embedding layer. So it looks like there is no way to use the Nous prompt template with this model until the weights and config are updated.
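
For reference, a minimal sketch (the model path is a placeholder, not this repo) of why adding the tokens to the tokenizer config alone is not enough: the embedding matrix still has the old vocab size, so the new token ids index out of range unless the embeddings are resized as well.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "path/to/merged-model"  # hypothetical path, for illustration only

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# Add the ChatML special tokens used by the Nous prompt template.
tokenizer.add_special_tokens(
    {"additional_special_tokens": ["<|im_start|>", "<|im_end|>"]}
)

# Without this step, prompts containing the new tokens produce ids >= the
# embedding table size, which is where the out-of-range error comes from.
model.resize_token_embeddings(len(tokenizer))
```

Note that resizing only appends freshly initialized rows; the trained embeddings for those tokens still have to come from updated weights, which is the point above.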
