Update modeling_ltgbert.py

#1

When initializing LtgbertForTokenClassification, several LayerNorm modules have no weight or bias.
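
If this surfaces as an error in `_init_weights` touching those affine-free LayerNorms (just a guess from the symptom, not confirmed against the actual code in modeling_ltgbert.py), a guard of this shape would avoid it:

    import torch.nn as nn

    def _init_weights(self, module):
        # Sketch only: LayerNorms created with elementwise_affine=False
        # store weight and bias as None, so check before touching .data.
        if isinstance(module, nn.LayerNorm):
            if module.bias is not None:
                module.bias.data.zero_()
            if module.weight is not None:
                module.weight.data.fill_(1.0)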

And when using transformers>=4.40, the two Metaspace pretokenizers in tokenizer.json need a "prepend_scheme" key, as follows:

      {
        "type": "Metaspace",
        "replacement": "▁",
        "add_prefix_space": false,
        "prepend_scheme": "never"
      },
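
For anyone patching a local copy, a small script like this adds the key to every Metaspace entry (a sketch; the file path is assumed, and the "never" value simply mirrors the snippet above):

    import json

    path = "tokenizer.json"  # assumed local path to the downloaded file

    with open(path, encoding="utf-8") as f:
        config = json.load(f)

    def patch_metaspace(node):
        # Walk the whole JSON tree and add "prepend_scheme" to each
        # Metaspace entry that lacks it (they can sit inside a Sequence).
        if isinstance(node, dict):
            if node.get("type") == "Metaspace":
                node.setdefault("prepend_scheme", "never")
            for value in node.values():
                patch_metaspace(value)
        elif isinstance(node, list):
            for item in node:
                patch_metaspace(item)

    patch_metaspace(config)

    with open(path, "w", encoding="utf-8") as f:
        json.dump(config, f, ensure_ascii=False, indent=2)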

Hi, thank you very much for reporting these issues! I will look into them more next week. We're still discussing what to do about the Metaspace pretokenizer, since its new behavior might silently break more things: https://huggingface.co/HPLT/hplt_bert_base_en/discussions/1
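
A quick way to catch such silent breakage is to diff tokenizations across transformers versions (a sketch; the probe strings are arbitrary and `trust_remote_code=True` is assumed to be needed for this checkpoint):

    from transformers import AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(
        "HPLT/hplt_bert_base_en", trust_remote_code=True
    )

    # Tokenize a few probe strings; record the ids under transformers<4.40
    # and compare with the output under >=4.40 to spot behavioral changes.
    probes = ["Hello world", " leading space", "hyphen-ated words"]
    for text in probes:
        ids = tokenizer(text, add_special_tokens=False)["input_ids"]
        print(repr(text), "->", ids)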
