set `add_special_tokens=True` when using the model with a "text-generation" pipeline
#5 opened by dcfidalgo
Just in case someone is trying to use this model with a "text-generation" pipeline: make sure you pass `add_special_tokens=True`, otherwise the model outputs nonsense.
import torch
from transformers import AutoTokenizer, AutoModelForCausalLM, pipeline
tokenizer = AutoTokenizer.from_pretrained('NousResearch/Nous-Hermes-2-Mixtral-8x7B-DPO', trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    "NousResearch/Nous-Hermes-2-Mixtral-8x7B-DPO",
    torch_dtype=torch.float16,
    device_map="auto",
    load_in_8bit=False,
    load_in_4bit=True,
    use_flash_attention_2=True,
)
prompts = [
"""<|im_start|>system
You are a sentient, superintelligent artificial general intelligence, here to teach and assist me.<|im_end|>
<|im_start|>user
Write a short story about Goku discovering kirby has teamed up with Majin Buu to destroy the world.<|im_end|>
<|im_start|>assistant""",
]
pl = pipeline(task="text-generation", tokenizer=tokenizer, model=model)
print(
    pl(prompts[0], max_new_tokens=128, temperature=0.8, repetition_penalty=1.1, do_sample=True, return_full_text=False)
)
# [{'generated_text': "“Ah... that's certainly one way of putting it” said Kirby who together as one now had teams working towards destroying the world.“Wait Kirby!” Said Saiyamin Super Eiko IichiiisimimisssimissimississSimismic! “That's quite a"}]
print(
    pl(prompts[0], max_new_tokens=128, temperature=0.8, repetition_penalty=1.1, do_sample=True, return_full_text=False, add_special_tokens=True)
)
# [{'generated_text': '\nIn the serene city of Satan City, all seemed peaceful until an alarming sight was observed — none other than Kirby, who had traveled through space-time via Warp Star, landed on Earth. Unbeknownst to its inhabitants, he carried the nefarious intention of teaming up with'}]
It seems the model is quite sensitive to the special `<s>` token at the beginning of the prompt, which is missing if `add_special_tokens=False`.
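If you want to check this yourself, here is a minimal sketch (not from the original post, reusing the `tokenizer` and `prompts` defined above) that compares the token ids produced with and without special tokens:

# Sketch: inspect how the prompt is tokenized with and without special tokens.
# With add_special_tokens=True the Mistral-style tokenizer prepends the <s>
# BOS token; without it, the prompt starts "bare".
ids_without = tokenizer(prompts[0], add_special_tokens=False)["input_ids"]
ids_with = tokenizer(prompts[0], add_special_tokens=True)["input_ids"]

print(tokenizer.convert_ids_to_tokens(ids_without[:3]))  # no <s> at the front
print(tokenizer.convert_ids_to_tokens(ids_with[:3]))     # should start with '<s>'
print(tokenizer.bos_token, tokenizer.bos_token_id)        # the BOS token and its id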