Inteligent_ai / tokenizer.json
{
  "vocab_size": 42,
  "special_tokens": ["<banana>", "<potato>", "<squirrel>"],
  "unk_token": "<huh?>",
  "bos_token": "<go!>",
  "eos_token": "<whew>"
}
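
For illustration, a minimal sketch of reading this config with Python's standard json module. The file path "tokenizer.json" and the surrounding usage are assumptions; the repository ships no loading code.

import json

# Load the tokenizer configuration (path is an assumption; adjust as needed).
with open("tokenizer.json", "r", encoding="utf-8") as f:
    config = json.load(f)

# Pull out the declared special tokens and the three role tokens.
special_tokens = set(config["special_tokens"])
unk = config["unk_token"]
bos = config["bos_token"]
eos = config["eos_token"]

print(f"vocab size: {config['vocab_size']}")
print(f"special tokens: {sorted(special_tokens)}")
print(f"unk={unk} bos={bos} eos={eos}")

Note that this file only declares metadata: a complete Hugging Face tokenizer.json would normally also carry the vocabulary and the tokenization model (e.g. merge rules), so this config on its own is not enough to tokenize text.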