---
license: mit
datasets:
- omarmomen/babylm_10M
language:
- en
metrics:
- perplexity
library_name: transformers
pipeline_tag: fill-mask
---

# Model Card for omarmomen/sf_ip_babylm_1

This model is part of the experiments in my master's thesis, "Linguistic Structure Induction from Language Models" (https://arxiv.org/abs/2403.09714).

"omarmomen/sf_ip_babylm_1" is the StructFormer (SF_m=4) referred to in Chapter 5 (p. 59): an in-between parser variant with the parser network positioned after 4 transformer blocks.

The model is trained on the BabyLM 10M dataset, using a RobertaTokenizer pretrained on the same dataset with a 16K-token vocabulary (https://huggingface.co/omarmomen/babylm_bpe_tokenizer_16k).
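
## How to Use

Below is a minimal fill-mask sketch. It assumes the checkpoint loads through the standard `transformers` Auto classes; since StructFormer is a custom architecture, `trust_remote_code=True` may be required, and the mask-token handling follows the RoBERTa convention. The example sentence is illustrative only.

```python
import torch
from transformers import AutoTokenizer, AutoModelForMaskedLM

# Assumption: the repo ships custom StructFormer modeling code,
# so trust_remote_code=True may be needed to load it.
tokenizer = AutoTokenizer.from_pretrained(
    "omarmomen/sf_ip_babylm_1", trust_remote_code=True
)
model = AutoModelForMaskedLM.from_pretrained(
    "omarmomen/sf_ip_babylm_1", trust_remote_code=True
)
model.eval()

# RoBERTa-style tokenizers expose the mask token as tokenizer.mask_token.
text = f"The children played in the {tokenizer.mask_token}."
inputs = tokenizer(text, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

# Locate the masked position and report the top predicted token.
mask_index = (inputs.input_ids == tokenizer.mask_token_id).nonzero(as_tuple=True)[1]
top_token_id = logits[0, mask_index].argmax(dim=-1)
print(tokenizer.decode(top_token_id))
```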