omarmomen committed on
Commit c5b4bfb (1 parent: efff561)

Create README.md

Files changed (1)
  1. README.md +19 -0
README.md ADDED
@@ -0,0 +1,19 @@
+ ---
+ license: mit
+ datasets:
+ - omarmomen/babylm_10M
+ language:
+ - en
+ metrics:
+ - perplexity
+ library_name: transformers
+ pipeline_tag: fill-mask
+ ---
+ # Model Card for omarmomen/sf_babylm_1
+
+ This model is part of the experiments in my master's thesis, "Linguistic Structure Induction from Language Models" (https://arxiv.org/abs/2403.09714).
+
+ "omarmomen/sf_babylm_1" is the StructFormer (SF_m=0) referred to in Chapter 5 (p. 59), where the parser network is positioned ahead of all the attention blocks.
+
+ The model is trained on the BabyLM 10M dataset, with a RobertaTokenizer pretrained on the BabyLM 10M dataset with a 16K-token vocabulary (https://huggingface.co/omarmomen/babylm_bpe_tokenizer_16k).
+
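
A minimal usage sketch (not part of this commit), based on the card's `library_name: transformers` and `pipeline_tag: fill-mask`. It assumes the checkpoint exposes a standard masked-LM head and, since StructFormer is a custom architecture, that loading requires `trust_remote_code=True`; the example sentence is arbitrary.

```python
from transformers import AutoModelForMaskedLM, AutoTokenizer, pipeline

MODEL_ID = "omarmomen/sf_babylm_1"  # model id from the card above

# Load the tokenizer and masked-LM checkpoint from the Hub.
# trust_remote_code=True is an assumption: StructFormer is a custom
# architecture, so the repo is expected to ship its own modeling code.
tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForMaskedLM.from_pretrained(MODEL_ID, trust_remote_code=True)

# Fill-mask pipeline, matching the card's pipeline_tag.
fill_mask = pipeline("fill-mask", model=model, tokenizer=tokenizer)
print(fill_mask(f"The children played in the {tokenizer.mask_token}."))
```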