Update modeling_ltgbert.py
#1 by KoichiYasuoka - opened
When initializing LtgbertForTokenClassification, several LayerNorms don't have a weight or bias.
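For reference, this is the behavior being described: a LayerNorm constructed without affine parameters registers neither a weight nor a bias. A minimal PyTorch sketch (the hidden size 768 is illustrative, not taken from modeling_ltgbert.py):

```python
import torch.nn as nn

# With elementwise_affine=False, both parameters are registered
# as None, so any code that assumes ln.weight / ln.bias exist
# (e.g. weight initialization loops) will fail on these modules.
ln_plain = nn.LayerNorm(768, elementwise_affine=False)
ln_affine = nn.LayerNorm(768)  # default: weight and bias present

print(ln_plain.weight, ln_plain.bias)  # None None
print(ln_affine.weight.shape)          # torch.Size([768])
```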
And when using transformers>=4.40, the two Metaspace entries in tokenizer.json need a "prepend_scheme" key, as follows:
{
  "type": "Metaspace",
  "replacement": "β",
  "add_prefix_space": false,
  "prepend_scheme": "never"
},
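One way to apply this fix is to patch the serialized tokenizer directly with the standard json module. A sketch (the helper name and the recursive-walk approach are mine, not from the repo; a real run would load tokenizer.json, patch it, and write it back):

```python
import json

def add_prepend_scheme(node, scheme="never"):
    """Recursively add "prepend_scheme" to every Metaspace
    entry that is missing it (modifies the structure in place)."""
    if isinstance(node, dict):
        if node.get("type") == "Metaspace":
            node.setdefault("prepend_scheme", scheme)
        for value in node.values():
            add_prepend_scheme(value, scheme)
    elif isinstance(node, list):
        for item in node:
            add_prepend_scheme(item, scheme)

# Demonstration on a minimal fragment shaped like the entry above.
fragment = {"pre_tokenizer": {"type": "Sequence", "pretokenizers": [
    {"type": "Metaspace", "replacement": "β", "add_prefix_space": False},
]}}
add_prepend_scheme(fragment)
print(json.dumps(fragment["pre_tokenizer"]["pretokenizers"][0],
                 ensure_ascii=False))
```

For the real file, replace the demonstration with `tok = json.load(open("tokenizer.json", encoding="utf-8"))`, call `add_prepend_scheme(tok)`, and dump it back with `ensure_ascii=False`.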
Hi, thank you very much for reporting these issues! I will look into this more next week. We're still discussing what to do about the Metaspace pretokenizer; its new behavior might silently break more things: https://huggingface.co/HPLT/hplt_bert_base_en/discussions/1