snousias committed on
Commit ca27ac1 · 0 Parent(s)

Duplicate from snousias/bert-base-greek-uncased-v5-finetuned-polylex-mg
.gitattributes ADDED
@@ -0,0 +1,35 @@
+ *.7z filter=lfs diff=lfs merge=lfs -text
+ *.arrow filter=lfs diff=lfs merge=lfs -text
+ *.bin filter=lfs diff=lfs merge=lfs -text
+ *.bz2 filter=lfs diff=lfs merge=lfs -text
+ *.ckpt filter=lfs diff=lfs merge=lfs -text
+ *.ftz filter=lfs diff=lfs merge=lfs -text
+ *.gz filter=lfs diff=lfs merge=lfs -text
+ *.h5 filter=lfs diff=lfs merge=lfs -text
+ *.joblib filter=lfs diff=lfs merge=lfs -text
+ *.lfs.* filter=lfs diff=lfs merge=lfs -text
+ *.mlmodel filter=lfs diff=lfs merge=lfs -text
+ *.model filter=lfs diff=lfs merge=lfs -text
+ *.msgpack filter=lfs diff=lfs merge=lfs -text
+ *.npy filter=lfs diff=lfs merge=lfs -text
+ *.npz filter=lfs diff=lfs merge=lfs -text
+ *.onnx filter=lfs diff=lfs merge=lfs -text
+ *.ot filter=lfs diff=lfs merge=lfs -text
+ *.parquet filter=lfs diff=lfs merge=lfs -text
+ *.pb filter=lfs diff=lfs merge=lfs -text
+ *.pickle filter=lfs diff=lfs merge=lfs -text
+ *.pkl filter=lfs diff=lfs merge=lfs -text
+ *.pt filter=lfs diff=lfs merge=lfs -text
+ *.pth filter=lfs diff=lfs merge=lfs -text
+ *.rar filter=lfs diff=lfs merge=lfs -text
+ *.safetensors filter=lfs diff=lfs merge=lfs -text
+ saved_model/**/* filter=lfs diff=lfs merge=lfs -text
+ *.tar.* filter=lfs diff=lfs merge=lfs -text
+ *.tar filter=lfs diff=lfs merge=lfs -text
+ *.tflite filter=lfs diff=lfs merge=lfs -text
+ *.tgz filter=lfs diff=lfs merge=lfs -text
+ *.wasm filter=lfs diff=lfs merge=lfs -text
+ *.xz filter=lfs diff=lfs merge=lfs -text
+ *.zip filter=lfs diff=lfs merge=lfs -text
+ *.zst filter=lfs diff=lfs merge=lfs -text
+ *tfevents* filter=lfs diff=lfs merge=lfs -text
.gitignore ADDED
@@ -0,0 +1 @@
+ checkpoint-*/
README.md ADDED
@@ -0,0 +1,601 @@
+ ---
+ base_model: nlpaueb/bert-base-greek-uncased-v1
+ tags:
+ - generated_from_trainer
+ model-index:
+ - name: bert-base-greek-uncased-v5-finetuned-polylex-mg
+   results: []
+ duplicated_from: snousias/bert-base-greek-uncased-v5-finetuned-polylex-mg
+ ---
+
+ # bert-base-greek-uncased-v5-finetuned-polylex-mg
+
+ This model is a fine-tuned version of [nlpaueb/bert-base-greek-uncased-v1](https://huggingface.co/nlpaueb/bert-base-greek-uncased-v1) on the PolylexMG dataset of multiword expressions in Modern Greek.
+ It achieves the following results on the evaluation set:
+ - Loss: 1.3369
+
+ ## Model description
+
+ This work adapts PolylexMG, a corpus of multiword expressions (MWEs) in Modern Greek whose entries span a wide spectrum of idiosyncrasy, to fine-tune the Greek BERT transformer model for masked language modelling and classification tasks. GREEK-BERT is a monolingual model based on the BERT-BASE-UNCASED architecture, pre-trained on free-text corpora extracted from (a) the Greek part of Wikipedia, (b) the Greek part of the European Parliament Proceedings Parallel Corpus (Europarl), and (c) the Greek part of OSCAR (Koutsikakis et al., 2020:113). Specifically, Greek BERT has been fine-tuned with expressions from each syntactic category described in PolylexMG (Fotopoulou et al., 2023), a dataset of 6,000 Greek lexical entries comprising frozen idioms, which are semantically fixed with no paradigmatic variation (Lamiroy, 2003), and light verb constructions, in which the semantics reside in the predicative noun rather than the verb (Anastassiadis-Symeonidis et al., 2020).
+
+ ## Results, intended uses & limitations
+
+ This section presents the experimental evaluation of the MWE-fine-tuned Greek BERT model on a classification use case. In this setup, raw Modern Greek text, possibly spanning multiple sentences, is processed by the language model, which reports whether the text segment contains stereotypical multiword expressions or not. We compared the fine-tuned BERT model with a baseline logistic regression model, which takes as input the same word embeddings as the MWE-fine-tuned BERT model.
+ Greek-MWE-BERT was trained in a masked language modelling setting on the full-expression subdataset. Model perplexity was measured at 303.21 before fine-tuning and 3.81 after, demonstrating that the model has gained domain knowledge of multiword expressions. Qualitative outcomes on 16 verbal constructs demonstrate that the fine-tuned model generates stereotypical multiword expressions in all cases, while the original Greek BERT yields incomplete, free-text-like sentence fragments.
+
+ The fine-tuned model was further fine-tuned with a classification-oriented architecture on the classification-task subdataset. The BERT classifier reaches an accuracy of 80%, with a precision of 80% for free text and a lower precision of 79% for MWEs. In comparison, the baseline classifier yields 70% for free text and 67% for MWEs. The two models differ by only about 10% in accuracy despite the simplicity of the baseline classifier; we attribute this small gap to the trained GreekBERT tokenizer, which is shared by both our model and the logistic regression baseline. However, the MWE-fine-tuned Greek BERT model better captures sentences containing MWEs, owing to the inherent advantages of the transformer architecture.
+
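The perplexity figures quoted above follow directly from the masked-LM cross-entropy loss, assuming the usual convention perplexity = exp(loss). As a quick consistency check (a sketch, not part of the training code), the reported evaluation loss of 1.3369 reproduces the post-finetuning perplexity of roughly 3.81:

```python
import math

eval_loss = 1.3369          # final evaluation loss reported above
ppl = math.exp(eval_loss)   # perplexity = exp(cross-entropy loss)
print(round(ppl, 2))        # rounds to the reported post-finetuning perplexity of 3.81
```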
+ ## Training and evaluation data
+
+ ### Sample of PolylexMG full-expression subdataset
+
+ |text |label|
+ |-----------------------|-----|
+ |αδειάζω τη γωνιά σε |1 |
+ |αδειάζω πιστόλι πάνω σε|1 |
+ |αλλάζω τον αδόξαστο σε |1 |
+ |αλλάζω την πίστη σε |1 |
+ |δεν αλλάζω ούτε κόμα σε|1 |
+ |αλλάζω λόγια με |1 |
+ |αλλάζω κουβέντες με |1 |
+ |αλλάζω τα μυαλά σε |1 |
+ |αλλάζω τα φώτα σε |1 |
+ |αλλάζω τα πετρέλαια σε |1 |
+ |αλλάζω τα πέταλα σε |1 |
+
+
+ ### Sample of PolylexMG classification subdataset
+
+ |text |label|
+ |-----------------------|-----|
+ |Μέσα σε λίγα λεπτά άναψαν τα αίματα και ο διαπληκτισμός άρχισε να γίνεται όλο και πιο έντονος|1 |
+ |Η πρώτη έκπληξη ήρθε αμέσως μόλις άναψαν τα τέσσερα κόκκινα φανάρια και το ένα πράσινο|0 |
+ |Γιατί τα κάνετε αυτά, για να γελάνε οι άλλοι μαζί μας;|0 |
+ |Κάθε φορά που έμπαινε καλάθι, έβγαζαν τις ίδιες ακριβώς ιαχές για να πάει γούρι και να μην κόψει η μαγιονέζα|1 |
+ |Η νέα πυρκαγιά ξεκινά από την πίσω πλευρά του Πεντελικού Όρους, σε σημείο που δεν είχε καεί|0 |
+
+ ## Training procedure
+
+ ### Training hyperparameters
+
+ The following hyperparameters were used during training:
+ - learning_rate: 5e-06
+ - train_batch_size: 512
+ - eval_batch_size: 512
+ - seed: 42
+ - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
+ - lr_scheduler_type: linear
+ - num_epochs: 500
+
+
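The optimizer settings listed above fully determine the parameter update rule. As an illustration (a minimal single-scalar sketch, not the actual training code), one bias-corrected Adam step with exactly these hyperparameters:

```python
import math

# Hyperparameters as listed in the model card
lr, beta1, beta2, eps = 5e-6, 0.9, 0.999, 1e-8

def adam_step(grad, m=0.0, v=0.0, t=1):
    """One bias-corrected Adam update for a single scalar parameter."""
    m = beta1 * m + (1 - beta1) * grad          # first-moment estimate
    v = beta2 * v + (1 - beta2) * grad ** 2     # second-moment estimate
    m_hat = m / (1 - beta1 ** t)                # bias correction
    v_hat = v / (1 - beta2 ** t)
    return lr * m_hat / (math.sqrt(v_hat) + eps), m, v

update, m, v = adam_step(0.5)
# On the very first step, the update magnitude is ~lr * sign(grad),
# i.e. about 5e-06 regardless of the gradient's scale.
print(update)
```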
+ ### Training results (Summary)
+
+ | Training Loss | Epoch | Step | Validation Loss |
+ |:-------------:|:-----:|:----:|:---------------:|
+ | 5.2105 | 1.0 | 13 | 4.4870 |
+ | 4.4319 | 2.0 | 26 | 3.8456 |
+ | 4.0318 | 3.0 | 39 | 3.4164 |
+ | 3.7558 | 4.0 | 52 | 3.2849 |
+ | ... | ... | ... | ... |
+ | 1.1307 | 497.0 | 6461 | 1.3311 |
+ | 1.1163 | 498.0 | 6474 | 1.3016 |
+ | 1.099 | 499.0 | 6487 | 1.3532 |
+ | 1.1246 | 500.0 | 6500 | 1.2222 |
+
+ ### Framework versions
+
+ - Transformers 4.31.0
+ - Pytorch 2.0.1+cu118
+ - Datasets 2.14.3
+ - Tokenizers 0.13.3
+
+
+ ### Training results (Full)
+
+ | Training Loss | Epoch | Step | Validation Loss |
+ |:-------------:|:-----:|:----:|:---------------:|
+ | 5.2105 | 1.0 | 13 | 4.4870 |
+ | 4.4319 | 2.0 | 26 | 3.8456 |
+ | 4.0318 | 3.0 | 39 | 3.4164 |
+ | 3.7558 | 4.0 | 52 | 3.2849 |
+ | 3.5626 | 5.0 | 65 | 3.3146 |
+ | 3.4355 | 6.0 | 78 | 3.1532 |
+ | 3.3299 | 7.0 | 91 | 3.0451 |
+ | 3.2313 | 8.0 | 104 | 2.9359 |
+ | 3.1758 | 9.0 | 117 | 2.8543 |
+ | 3.0762 | 10.0 | 130 | 2.8034 |
+ | 3.0318 | 11.0 | 143 | 2.7975 |
+ | 2.9481 | 12.0 | 156 | 2.6439 |
+ | 2.8848 | 13.0 | 169 | 2.6623 |
+ | 2.9002 | 14.0 | 182 | 2.6425 |
+ | 2.8435 | 15.0 | 195 | 2.6639 |
+ | 2.8451 | 16.0 | 208 | 2.6203 |
+ | 2.7987 | 17.0 | 221 | 2.5597 |
+ | 2.7522 | 18.0 | 234 | 2.5719 |
+ | 2.7194 | 19.0 | 247 | 2.6220 |
+ | 2.6923 | 20.0 | 260 | 2.5566 |
+ | 2.678 | 21.0 | 273 | 2.4172 |
+ | 2.6612 | 22.0 | 286 | 2.5726 |
+ | 2.6272 | 23.0 | 299 | 2.4478 |
+ | 2.6052 | 24.0 | 312 | 2.4366 |
+ | 2.5694 | 25.0 | 325 | 2.3694 |
+ | 2.593 | 26.0 | 338 | 2.4324 |
+ | 2.548 | 27.0 | 351 | 2.4070 |
+ | 2.4954 | 28.0 | 364 | 2.3651 |
+ | 2.5097 | 29.0 | 377 | 2.3268 |
+ | 2.5041 | 30.0 | 390 | 2.4208 |
+ | 2.4919 | 31.0 | 403 | 2.4321 |
+ | 2.461 | 32.0 | 416 | 2.3477 |
+ | 2.4698 | 33.0 | 429 | 2.4017 |
+ | 2.4557 | 34.0 | 442 | 2.3050 |
+ | 2.4464 | 35.0 | 455 | 2.3282 |
+ | 2.4215 | 36.0 | 468 | 2.3339 |
+ | 2.4037 | 37.0 | 481 | 2.2429 |
+ | 2.386 | 38.0 | 494 | 2.3452 |
+ | 2.3961 | 39.0 | 507 | 2.3312 |
+ | 2.3985 | 40.0 | 520 | 2.2921 |
+ | 2.3302 | 41.0 | 533 | 2.2711 |
+ | 2.3128 | 42.0 | 546 | 2.2344 |
+ | 2.3158 | 43.0 | 559 | 2.1982 |
+ | 2.2927 | 44.0 | 572 | 2.1473 |
+ | 2.3122 | 45.0 | 585 | 2.2317 |
+ | 2.2885 | 46.0 | 598 | 2.2060 |
+ | 2.2592 | 47.0 | 611 | 2.1943 |
+ | 2.2492 | 48.0 | 624 | 2.2361 |
+ | 2.2495 | 49.0 | 637 | 2.2059 |
+ | 2.2402 | 50.0 | 650 | 2.1461 |
+ | 2.241 | 51.0 | 663 | 2.2181 |
+ | 2.211 | 52.0 | 676 | 2.0885 |
+ | 2.2165 | 53.0 | 689 | 2.1567 |
+ | 2.2063 | 54.0 | 702 | 2.2112 |
+ | 2.1715 | 55.0 | 715 | 2.2934 |
+ | 2.1601 | 56.0 | 728 | 2.0745 |
+ | 2.1796 | 57.0 | 741 | 2.1070 |
+ | 2.152 | 58.0 | 754 | 2.0930 |
+ | 2.1562 | 59.0 | 767 | 2.1106 |
+ | 2.125 | 60.0 | 780 | 2.1529 |
+ | 2.1318 | 61.0 | 793 | 2.0296 |
+ | 2.1194 | 62.0 | 806 | 2.0323 |
+ | 2.1396 | 63.0 | 819 | 1.9835 |
+ | 2.1108 | 64.0 | 832 | 2.0066 |
+ | 2.0874 | 65.0 | 845 | 1.9062 |
+ | 2.0754 | 66.0 | 858 | 2.1728 |
+ | 2.0928 | 67.0 | 871 | 2.0197 |
+ | 2.0835 | 68.0 | 884 | 2.0767 |
+ | 2.0684 | 69.0 | 897 | 2.1482 |
+ | 2.0505 | 70.0 | 910 | 2.0667 |
+ | 2.0564 | 71.0 | 923 | 2.1489 |
+ | 2.0478 | 72.0 | 936 | 2.0015 |
+ | 2.0478 | 73.0 | 949 | 1.9215 |
+ | 2.0316 | 74.0 | 962 | 2.0238 |
+ | 2.0171 | 75.0 | 975 | 2.0014 |
+ | 2.0248 | 76.0 | 988 | 2.0775 |
+ | 2.0066 | 77.0 | 1001 | 2.0390 |
+ | 2.0018 | 78.0 | 1014 | 2.0043 |
+ | 1.9925 | 79.0 | 1027 | 2.0138 |
+ | 1.9614 | 80.0 | 1040 | 1.9499 |
+ | 1.9877 | 81.0 | 1053 | 1.9642 |
+ | 1.9499 | 82.0 | 1066 | 1.9676 |
+ | 1.932 | 83.0 | 1079 | 1.9332 |
+ | 1.9353 | 84.0 | 1092 | 1.8787 |
+ | 1.9672 | 85.0 | 1105 | 1.9720 |
+ | 1.9313 | 86.0 | 1118 | 1.9343 |
+ | 1.9292 | 87.0 | 1131 | 1.8964 |
+ | 1.9277 | 88.0 | 1144 | 1.9619 |
+ | 1.9158 | 89.0 | 1157 | 1.9608 |
+ | 1.921 | 90.0 | 1170 | 1.9171 |
+ | 1.9191 | 91.0 | 1183 | 1.8871 |
+ | 1.8935 | 92.0 | 1196 | 1.8857 |
+ | 1.8818 | 93.0 | 1209 | 1.8909 |
+ | 1.8782 | 94.0 | 1222 | 1.8951 |
+ | 1.9028 | 95.0 | 1235 | 1.9164 |
+ | 1.8907 | 96.0 | 1248 | 1.9650 |
+ | 1.8626 | 97.0 | 1261 | 1.8906 |
+ | 1.8413 | 98.0 | 1274 | 1.8957 |
+ | 1.854 | 99.0 | 1287 | 1.9644 |
+ | 1.8608 | 100.0 | 1300 | 1.8329 |
+ | 1.8623 | 101.0 | 1313 | 1.8693 |
+ | 1.7798 | 102.0 | 1326 | 1.8913 |
+ | 1.846 | 103.0 | 1339 | 1.7854 |
+ | 1.7972 | 104.0 | 1352 | 1.8611 |
+ | 1.8443 | 105.0 | 1365 | 1.8482 |
+ | 1.791 | 106.0 | 1378 | 1.7168 |
+ | 1.7879 | 107.0 | 1391 | 1.8093 |
+ | 1.7886 | 108.0 | 1404 | 1.8924 |
+ | 1.8192 | 109.0 | 1417 | 1.7715 |
+ | 1.7919 | 110.0 | 1430 | 1.7415 |
+ | 1.7581 | 111.0 | 1443 | 1.7956 |
+ | 1.7873 | 112.0 | 1456 | 1.7213 |
+ | 1.7873 | 113.0 | 1469 | 1.7340 |
+ | 1.7764 | 114.0 | 1482 | 1.8535 |
+ | 1.7612 | 115.0 | 1495 | 1.8554 |
+ | 1.7737 | 116.0 | 1508 | 1.8126 |
+ | 1.7416 | 117.0 | 1521 | 1.8327 |
+ | 1.7648 | 118.0 | 1534 | 1.6832 |
+ | 1.7262 | 119.0 | 1547 | 1.6972 |
+ | 1.7334 | 120.0 | 1560 | 1.7930 |
+ | 1.7172 | 121.0 | 1573 | 1.6962 |
+ | 1.7282 | 122.0 | 1586 | 1.8800 |
+ | 1.7038 | 123.0 | 1599 | 1.7828 |
+ | 1.6935 | 124.0 | 1612 | 1.7646 |
+ | 1.758 | 125.0 | 1625 | 1.8069 |
+ | 1.7018 | 126.0 | 1638 | 1.6958 |
+ | 1.6886 | 127.0 | 1651 | 1.6692 |
+ | 1.7004 | 128.0 | 1664 | 1.7256 |
+ | 1.6947 | 129.0 | 1677 | 1.7587 |
+ | 1.6897 | 130.0 | 1690 | 1.7484 |
+ | 1.7037 | 131.0 | 1703 | 1.8455 |
+ | 1.6981 | 132.0 | 1716 | 1.7588 |
+ | 1.6828 | 133.0 | 1729 | 1.7421 |
+ | 1.6596 | 134.0 | 1742 | 1.6933 |
+ | 1.6782 | 135.0 | 1755 | 1.7040 |
+ | 1.6595 | 136.0 | 1768 | 1.6705 |
+ | 1.6567 | 137.0 | 1781 | 1.7744 |
+ | 1.6588 | 138.0 | 1794 | 1.6545 |
+ | 1.6225 | 139.0 | 1807 | 1.7576 |
+ | 1.6394 | 140.0 | 1820 | 1.7256 |
+ | 1.6515 | 141.0 | 1833 | 1.6668 |
+ | 1.6331 | 142.0 | 1846 | 1.7884 |
+ | 1.6367 | 143.0 | 1859 | 1.7093 |
+ | 1.6335 | 144.0 | 1872 | 1.7098 |
+ | 1.6501 | 145.0 | 1885 | 1.6671 |
+ | 1.6192 | 146.0 | 1898 | 1.7073 |
+ | 1.6198 | 147.0 | 1911 | 1.6653 |
+ | 1.6182 | 148.0 | 1924 | 1.6723 |
+ | 1.6172 | 149.0 | 1937 | 1.7293 |
+ | 1.6129 | 150.0 | 1950 | 1.6545 |
+ | 1.6054 | 151.0 | 1963 | 1.6850 |
+ | 1.5967 | 152.0 | 1976 | 1.7064 |
+ | 1.6028 | 153.0 | 1989 | 1.5292 |
+ | 1.6156 | 154.0 | 2002 | 1.6477 |
+ | 1.5965 | 155.0 | 2015 | 1.6110 |
+ | 1.5695 | 156.0 | 2028 | 1.7071 |
+ | 1.5586 | 157.0 | 2041 | 1.6504 |
+ | 1.561 | 158.0 | 2054 | 1.6147 |
+ | 1.5643 | 159.0 | 2067 | 1.6941 |
+ | 1.5797 | 160.0 | 2080 | 1.7398 |
+ | 1.5609 | 161.0 | 2093 | 1.5761 |
+ | 1.5465 | 162.0 | 2106 | 1.6003 |
+ | 1.5467 | 163.0 | 2119 | 1.5839 |
+ | 1.5935 | 164.0 | 2132 | 1.6530 |
+ | 1.5439 | 165.0 | 2145 | 1.6743 |
+ | 1.559 | 166.0 | 2158 | 1.5143 |
+ | 1.5648 | 167.0 | 2171 | 1.6390 |
+ | 1.552 | 168.0 | 2184 | 1.5389 |
+ | 1.5164 | 169.0 | 2197 | 1.5879 |
+ | 1.5342 | 170.0 | 2210 | 1.6785 |
+ | 1.5319 | 171.0 | 2223 | 1.6341 |
+ | 1.5477 | 172.0 | 2236 | 1.7071 |
+ | 1.5364 | 173.0 | 2249 | 1.6268 |
+ | 1.5366 | 174.0 | 2262 | 1.7247 |
+ | 1.5445 | 175.0 | 2275 | 1.6668 |
+ | 1.4916 | 176.0 | 2288 | 1.5756 |
+ | 1.509 | 177.0 | 2301 | 1.5412 |
+ | 1.5316 | 178.0 | 2314 | 1.6270 |
+ | 1.5156 | 179.0 | 2327 | 1.6423 |
+ | 1.4918 | 180.0 | 2340 | 1.6112 |
+ | 1.4997 | 181.0 | 2353 | 1.5775 |
+ | 1.5187 | 182.0 | 2366 | 1.6248 |
+ | 1.5254 | 183.0 | 2379 | 1.5884 |
+ | 1.4732 | 184.0 | 2392 | 1.5787 |
+ | 1.4844 | 185.0 | 2405 | 1.5358 |
+ | 1.4882 | 186.0 | 2418 | 1.5144 |
+ | 1.478 | 187.0 | 2431 | 1.5223 |
+ | 1.5101 | 188.0 | 2444 | 1.5787 |
+ | 1.4688 | 189.0 | 2457 | 1.5479 |
+ | 1.4815 | 190.0 | 2470 | 1.5141 |
+ | 1.4925 | 191.0 | 2483 | 1.5939 |
+ | 1.467 | 192.0 | 2496 | 1.5471 |
+ | 1.4718 | 193.0 | 2509 | 1.6845 |
+ | 1.4699 | 194.0 | 2522 | 1.5943 |
+ | 1.4562 | 195.0 | 2535 | 1.4745 |
+ | 1.4451 | 196.0 | 2548 | 1.5922 |
+ | 1.4451 | 197.0 | 2561 | 1.5856 |
+ | 1.4624 | 198.0 | 2574 | 1.5519 |
+ | 1.444 | 199.0 | 2587 | 1.6538 |
+ | 1.4498 | 200.0 | 2600 | 1.5037 |
+ | 1.4285 | 201.0 | 2613 | 1.5539 |
+ | 1.4439 | 202.0 | 2626 | 1.5387 |
+ | 1.4177 | 203.0 | 2639 | 1.5756 |
+ | 1.436 | 204.0 | 2652 | 1.6136 |
+ | 1.4184 | 205.0 | 2665 | 1.5014 |
+ | 1.43 | 206.0 | 2678 | 1.4983 |
+ | 1.4347 | 207.0 | 2691 | 1.5896 |
+ | 1.39 | 208.0 | 2704 | 1.5506 |
+ | 1.4198 | 209.0 | 2717 | 1.5142 |
+ | 1.4101 | 210.0 | 2730 | 1.4930 |
+ | 1.4219 | 211.0 | 2743 | 1.4814 |
+ | 1.4039 | 212.0 | 2756 | 1.3750 |
+ | 1.4479 | 213.0 | 2769 | 1.5330 |
+ | 1.4354 | 214.0 | 2782 | 1.5179 |
+ | 1.4163 | 215.0 | 2795 | 1.5970 |
+ | 1.4459 | 216.0 | 2808 | 1.4755 |
+ | 1.3714 | 217.0 | 2821 | 1.4230 |
+ | 1.3957 | 218.0 | 2834 | 1.5087 |
+ | 1.396 | 219.0 | 2847 | 1.5570 |
+ | 1.3866 | 220.0 | 2860 | 1.4955 |
+ | 1.4122 | 221.0 | 2873 | 1.4272 |
+ | 1.371 | 222.0 | 2886 | 1.5209 |
+ | 1.3907 | 223.0 | 2899 | 1.4725 |
+ | 1.3856 | 224.0 | 2912 | 1.5021 |
+ | 1.4053 | 225.0 | 2925 | 1.4880 |
+ | 1.4074 | 226.0 | 2938 | 1.4988 |
+ | 1.3827 | 227.0 | 2951 | 1.5527 |
+ | 1.4045 | 228.0 | 2964 | 1.5350 |
+ | 1.3626 | 229.0 | 2977 | 1.5093 |
+ | 1.3795 | 230.0 | 2990 | 1.4497 |
+ | 1.3973 | 231.0 | 3003 | 1.5106 |
+ | 1.3703 | 232.0 | 3016 | 1.4619 |
+ | 1.3942 | 233.0 | 3029 | 1.4553 |
+ | 1.3447 | 234.0 | 3042 | 1.5061 |
+ | 1.3438 | 235.0 | 3055 | 1.5167 |
+ | 1.3496 | 236.0 | 3068 | 1.4060 |
+ | 1.3614 | 237.0 | 3081 | 1.4211 |
+ | 1.3618 | 238.0 | 3094 | 1.4624 |
+ | 1.359 | 239.0 | 3107 | 1.4450 |
+ | 1.3657 | 240.0 | 3120 | 1.4795 |
+ | 1.3599 | 241.0 | 3133 | 1.4887 |
+ | 1.3532 | 242.0 | 3146 | 1.4606 |
+ | 1.3528 | 243.0 | 3159 | 1.4225 |
+ | 1.3445 | 244.0 | 3172 | 1.3912 |
+ | 1.3344 | 245.0 | 3185 | 1.4055 |
+ | 1.3358 | 246.0 | 3198 | 1.5152 |
+ | 1.3591 | 247.0 | 3211 | 1.4825 |
+ | 1.3162 | 248.0 | 3224 | 1.4721 |
+ | 1.3197 | 249.0 | 3237 | 1.4375 |
+ | 1.3358 | 250.0 | 3250 | 1.4644 |
+ | 1.3374 | 251.0 | 3263 | 1.4449 |
+ | 1.3548 | 252.0 | 3276 | 1.4405 |
+ | 1.3266 | 253.0 | 3289 | 1.5357 |
+ | 1.3172 | 254.0 | 3302 | 1.3515 |
+ | 1.3089 | 255.0 | 3315 | 1.4408 |
+ | 1.3209 | 256.0 | 3328 | 1.3895 |
+ | 1.3047 | 257.0 | 3341 | 1.4508 |
+ | 1.2877 | 258.0 | 3354 | 1.3954 |
+ | 1.3409 | 259.0 | 3367 | 1.4417 |
+ | 1.31 | 260.0 | 3380 | 1.5124 |
+ | 1.3229 | 261.0 | 3393 | 1.4047 |
+ | 1.3275 | 262.0 | 3406 | 1.3780 |
+ | 1.295 | 263.0 | 3419 | 1.4209 |
+ | 1.3279 | 264.0 | 3432 | 1.3867 |
+ | 1.291 | 265.0 | 3445 | 1.4694 |
+ | 1.2839 | 266.0 | 3458 | 1.5100 |
+ | 1.3064 | 267.0 | 3471 | 1.3646 |
+ | 1.3086 | 268.0 | 3484 | 1.4390 |
+ | 1.3381 | 269.0 | 3497 | 1.4367 |
+ | 1.3333 | 270.0 | 3510 | 1.4078 |
+ | 1.2775 | 271.0 | 3523 | 1.5213 |
+ | 1.2989 | 272.0 | 3536 | 1.4341 |
+ | 1.2759 | 273.0 | 3549 | 1.5165 |
+ | 1.2796 | 274.0 | 3562 | 1.4705 |
+ | 1.3037 | 275.0 | 3575 | 1.3945 |
+ | 1.3132 | 276.0 | 3588 | 1.4560 |
+ | 1.2816 | 277.0 | 3601 | 1.4123 |
+ | 1.2934 | 278.0 | 3614 | 1.3742 |
+ | 1.2873 | 279.0 | 3627 | 1.3824 |
+ | 1.2842 | 280.0 | 3640 | 1.3269 |
+ | 1.2617 | 281.0 | 3653 | 1.4345 |
+ | 1.2661 | 282.0 | 3666 | 1.4682 |
+ | 1.3096 | 283.0 | 3679 | 1.3989 |
+ | 1.2724 | 284.0 | 3692 | 1.3142 |
+ | 1.2529 | 285.0 | 3705 | 1.2795 |
+ | 1.2611 | 286.0 | 3718 | 1.3844 |
+ | 1.2578 | 287.0 | 3731 | 1.3536 |
+ | 1.2854 | 288.0 | 3744 | 1.3770 |
+ | 1.2811 | 289.0 | 3757 | 1.3892 |
+ | 1.2189 | 290.0 | 3770 | 1.3767 |
+ | 1.283 | 291.0 | 3783 | 1.4034 |
+ | 1.2684 | 292.0 | 3796 | 1.3867 |
+ | 1.241 | 293.0 | 3809 | 1.3572 |
+ | 1.2503 | 294.0 | 3822 | 1.3583 |
+ | 1.2605 | 295.0 | 3835 | 1.4600 |
+ | 1.2697 | 296.0 | 3848 | 1.2754 |
+ | 1.2469 | 297.0 | 3861 | 1.4295 |
+ | 1.2451 | 298.0 | 3874 | 1.4645 |
+ | 1.2765 | 299.0 | 3887 | 1.3605 |
+ | 1.2482 | 300.0 | 3900 | 1.4915 |
+ | 1.2564 | 301.0 | 3913 | 1.3490 |
+ | 1.233 | 302.0 | 3926 | 1.3273 |
+ | 1.2313 | 303.0 | 3939 | 1.3861 |
+ | 1.2491 | 304.0 | 3952 | 1.4016 |
+ | 1.2607 | 305.0 | 3965 | 1.3714 |
+ | 1.2548 | 306.0 | 3978 | 1.3572 |
+ | 1.2536 | 307.0 | 3991 | 1.3630 |
+ | 1.24 | 308.0 | 4004 | 1.3070 |
+ | 1.2352 | 309.0 | 4017 | 1.4311 |
+ | 1.2643 | 310.0 | 4030 | 1.2794 |
+ | 1.2281 | 311.0 | 4043 | 1.3855 |
+ | 1.2428 | 312.0 | 4056 | 1.3784 |
+ | 1.2196 | 313.0 | 4069 | 1.3430 |
+ | 1.2116 | 314.0 | 4082 | 1.4230 |
+ | 1.2261 | 315.0 | 4095 | 1.4760 |
+ | 1.25 | 316.0 | 4108 | 1.3658 |
+ | 1.2281 | 317.0 | 4121 | 1.3563 |
+ | 1.2308 | 318.0 | 4134 | 1.3107 |
+ | 1.2247 | 319.0 | 4147 | 1.3554 |
+ | 1.2354 | 320.0 | 4160 | 1.3956 |
+ | 1.2168 | 321.0 | 4173 | 1.2753 |
+ | 1.2078 | 322.0 | 4186 | 1.3253 |
+ | 1.2481 | 323.0 | 4199 | 1.3025 |
+ | 1.2331 | 324.0 | 4212 | 1.3707 |
+ | 1.1974 | 325.0 | 4225 | 1.2874 |
+ | 1.212 | 326.0 | 4238 | 1.3210 |
+ | 1.225 | 327.0 | 4251 | 1.4129 |
+ | 1.2161 | 328.0 | 4264 | 1.3364 |
+ | 1.2304 | 329.0 | 4277 | 1.3822 |
+ | 1.1903 | 330.0 | 4290 | 1.4887 |
+ | 1.2208 | 331.0 | 4303 | 1.2687 |
+ | 1.229 | 332.0 | 4316 | 1.3730 |
+ | 1.205 | 333.0 | 4329 | 1.3521 |
+ | 1.2023 | 334.0 | 4342 | 1.3770 |
+ | 1.2151 | 335.0 | 4355 | 1.3095 |
+ | 1.2255 | 336.0 | 4368 | 1.3003 |
+ | 1.2205 | 337.0 | 4381 | 1.2123 |
+ | 1.203 | 338.0 | 4394 | 1.2995 |
+ | 1.2013 | 339.0 | 4407 | 1.2838 |
+ | 1.1997 | 340.0 | 4420 | 1.3023 |
+ | 1.2033 | 341.0 | 4433 | 1.3111 |
+ | 1.1934 | 342.0 | 4446 | 1.4057 |
+ | 1.1832 | 343.0 | 4459 | 1.3468 |
+ | 1.2405 | 344.0 | 4472 | 1.3362 |
+ | 1.1803 | 345.0 | 4485 | 1.4813 |
+ | 1.2154 | 346.0 | 4498 | 1.3207 |
+ | 1.2314 | 347.0 | 4511 | 1.3236 |
+ | 1.1927 | 348.0 | 4524 | 1.3428 |
+ | 1.2194 | 349.0 | 4537 | 1.3533 |
+ | 1.1995 | 350.0 | 4550 | 1.3465 |
+ | 1.177 | 351.0 | 4563 | 1.3484 |
+ | 1.1993 | 352.0 | 4576 | 1.2859 |
+ | 1.1687 | 353.0 | 4589 | 1.2699 |
+ | 1.2045 | 354.0 | 4602 | 1.3686 |
+ | 1.2084 | 355.0 | 4615 | 1.3515 |
+ | 1.1837 | 356.0 | 4628 | 1.2735 |
+ | 1.1937 | 357.0 | 4641 | 1.2835 |
+ | 1.2004 | 358.0 | 4654 | 1.2793 |
+ | 1.1838 | 359.0 | 4667 | 1.2798 |
+ | 1.2026 | 360.0 | 4680 | 1.3856 |
+ | 1.1669 | 361.0 | 4693 | 1.3719 |
+ | 1.1716 | 362.0 | 4706 | 1.2613 |
+ | 1.1906 | 363.0 | 4719 | 1.2719 |
+ | 1.1914 | 364.0 | 4732 | 1.3864 |
+ | 1.1874 | 365.0 | 4745 | 1.3255 |
+ | 1.1848 | 366.0 | 4758 | 1.2984 |
+ | 1.1778 | 367.0 | 4771 | 1.3461 |
+ | 1.1964 | 368.0 | 4784 | 1.3320 |
+ | 1.16 | 369.0 | 4797 | 1.2962 |
+ | 1.1873 | 370.0 | 4810 | 1.3035 |
+ | 1.1632 | 371.0 | 4823 | 1.3465 |
+ | 1.1807 | 372.0 | 4836 | 1.3453 |
+ | 1.1331 | 373.0 | 4849 | 1.3527 |
+ | 1.1694 | 374.0 | 4862 | 1.2928 |
+ | 1.1615 | 375.0 | 4875 | 1.3519 |
+ | 1.1944 | 376.0 | 4888 | 1.4072 |
+ | 1.163 | 377.0 | 4901 | 1.3156 |
+ | 1.1719 | 378.0 | 4914 | 1.3074 |
+ | 1.1721 | 379.0 | 4927 | 1.3121 |
+ | 1.1618 | 380.0 | 4940 | 1.3039 |
+ | 1.1852 | 381.0 | 4953 | 1.3562 |
+ | 1.1838 | 382.0 | 4966 | 1.3383 |
+ | 1.1616 | 383.0 | 4979 | 1.2922 |
+ | 1.1401 | 384.0 | 4992 | 1.2676 |
+ | 1.165 | 385.0 | 5005 | 1.2625 |
+ | 1.1564 | 386.0 | 5018 | 1.1716 |
+ | 1.1662 | 387.0 | 5031 | 1.2738 |
+ | 1.1761 | 388.0 | 5044 | 1.4011 |
+ | 1.1587 | 389.0 | 5057 | 1.3821 |
+ | 1.1517 | 390.0 | 5070 | 1.2879 |
+ | 1.1699 | 391.0 | 5083 | 1.2898 |
+ | 1.149 | 392.0 | 5096 | 1.2710 |
+ | 1.1541 | 393.0 | 5109 | 1.2612 |
+ | 1.1597 | 394.0 | 5122 | 1.2993 |
+ | 1.1449 | 395.0 | 5135 | 1.2522 |
+ | 1.1332 | 396.0 | 5148 | 1.3367 |
+ | 1.1537 | 397.0 | 5161 | 1.3018 |
+ | 1.1789 | 398.0 | 5174 | 1.3705 |
+ | 1.169 | 399.0 | 5187 | 1.3128 |
+ | 1.1685 | 400.0 | 5200 | 1.3068 |
+ | 1.137 | 401.0 | 5213 | 1.2384 |
+ | 1.177 | 402.0 | 5226 | 1.2547 |
+ | 1.1592 | 403.0 | 5239 | 1.3295 |
+ | 1.1477 | 404.0 | 5252 | 1.3415 |
+ | 1.1465 | 405.0 | 5265 | 1.2466 |
+ | 1.1743 | 406.0 | 5278 | 1.3045 |
+ | 1.1386 | 407.0 | 5291 | 1.3124 |
+ | 1.1379 | 408.0 | 5304 | 1.2826 |
+ | 1.1828 | 409.0 | 5317 | 1.2788 |
+ | 1.1353 | 410.0 | 5330 | 1.3787 |
+ | 1.1536 | 411.0 | 5343 | 1.2968 |
+ | 1.1495 | 412.0 | 5356 | 1.2920 |
+ | 1.1424 | 413.0 | 5369 | 1.3238 |
+ | 1.158 | 414.0 | 5382 | 1.3301 |
+ | 1.1715 | 415.0 | 5395 | 1.2298 |
+ | 1.1559 | 416.0 | 5408 | 1.2769 |
+ | 1.1399 | 417.0 | 5421 | 1.3263 |
+ | 1.186 | 418.0 | 5434 | 1.2924 |
+ | 1.1653 | 419.0 | 5447 | 1.3279 |
+ | 1.14 | 420.0 | 5460 | 1.2892 |
+ | 1.1463 | 421.0 | 5473 | 1.3875 |
+ | 1.1406 | 422.0 | 5486 | 1.3136 |
+ | 1.1705 | 423.0 | 5499 | 1.2579 |
+ | 1.1065 | 424.0 | 5512 | 1.2955 |
+ | 1.145 | 425.0 | 5525 | 1.2970 |
+ | 1.1538 | 426.0 | 5538 | 1.3030 |
+ | 1.1674 | 427.0 | 5551 | 1.3060 |
+ | 1.1283 | 428.0 | 5564 | 1.2325 |
+ | 1.1683 | 429.0 | 5577 | 1.3085 |
+ | 1.1598 | 430.0 | 5590 | 1.2469 |
+ | 1.1429 | 431.0 | 5603 | 1.2523 |
+ | 1.1552 | 432.0 | 5616 | 1.3124 |
+ | 1.1722 | 433.0 | 5629 | 1.2955 |
+ | 1.1329 | 434.0 | 5642 | 1.3249 |
+ | 1.1486 | 435.0 | 5655 | 1.3245 |
+ | 1.124 | 436.0 | 5668 | 1.4052 |
+ | 1.1092 | 437.0 | 5681 | 1.2399 |
+ | 1.135 | 438.0 | 5694 | 1.2788 |
+ | 1.1637 | 439.0 | 5707 | 1.2844 |
+ | 1.1712 | 440.0 | 5720 | 1.2531 |
+ | 1.1401 | 441.0 | 5733 | 1.2790 |
+ | 1.1195 | 442.0 | 5746 | 1.2876 |
+ | 1.1524 | 443.0 | 5759 | 1.2565 |
+ | 1.1292 | 444.0 | 5772 | 1.1492 |
+ | 1.1342 | 445.0 | 5785 | 1.3050 |
+ | 1.1628 | 446.0 | 5798 | 1.2911 |
+ | 1.1286 | 447.0 | 5811 | 1.3624 |
+ | 1.1193 | 448.0 | 5824 | 1.2382 |
+ | 1.1521 | 449.0 | 5837 | 1.2717 |
+ | 1.1128 | 450.0 | 5850 | 1.2865 |
+ | 1.1321 | 451.0 | 5863 | 1.2785 |
+ | 1.1707 | 452.0 | 5876 | 1.3514 |
+ | 1.1431 | 453.0 | 5889 | 1.3321 |
+ | 1.1413 | 454.0 | 5902 | 1.2886 |
+ | 1.0983 | 455.0 | 5915 | 1.3165 |
+ | 1.1202 | 456.0 | 5928 | 1.2375 |
+ | 1.1259 | 457.0 | 5941 | 1.2166 |
+ | 1.1353 | 458.0 | 5954 | 1.3579 |
+ | 1.1272 | 459.0 | 5967 | 1.2890 |
+ | 1.1411 | 460.0 | 5980 | 1.2397 |
+ | 1.115 | 461.0 | 5993 | 1.2803 |
+ | 1.14 | 462.0 | 6006 | 1.2439 |
+ | 1.11 | 463.0 | 6019 | 1.1894 |
+ | 1.1539 | 464.0 | 6032 | 1.2979 |
+ | 1.1052 | 465.0 | 6045 | 1.2281 |
+ | 1.1092 | 466.0 | 6058 | 1.2853 |
+ | 1.1229 | 467.0 | 6071 | 1.2988 |
+ | 1.1209 | 468.0 | 6084 | 1.3058 |
+ | 1.1147 | 469.0 | 6097 | 1.2705 |
+ | 1.1228 | 470.0 | 6110 | 1.2435 |
+ | 1.1124 | 471.0 | 6123 | 1.2188 |
+ | 1.0922 | 472.0 | 6136 | 1.2892 |
+ | 1.1228 | 473.0 | 6149 | 1.2250 |
+ | 1.1341 | 474.0 | 6162 | 1.2373 |
+ | 1.1295 | 475.0 | 6175 | 1.2126 |
+ | 1.1105 | 476.0 | 6188 | 1.3032 |
+ | 1.1223 | 477.0 | 6201 | 1.2190 |
+ | 1.1487 | 478.0 | 6214 | 1.2728 |
+ | 1.1288 | 479.0 | 6227 | 1.3258 |
+ | 1.1398 | 480.0 | 6240 | 1.2114 |
+ | 1.1127 | 481.0 | 6253 | 1.2695 |
+ | 1.135 | 482.0 | 6266 | 1.3376 |
+ | 1.106 | 483.0 | 6279 | 1.2860 |
+ | 1.0978 | 484.0 | 6292 | 1.3001 |
+ | 1.1254 | 485.0 | 6305 | 1.3180 |
+ | 1.1117 | 486.0 | 6318 | 1.3036 |
+ | 1.1249 | 487.0 | 6331 | 1.2380 |
+ | 1.1111 | 488.0 | 6344 | 1.3112 |
+ | 1.119 | 489.0 | 6357 | 1.2587 |
+ | 1.1203 | 490.0 | 6370 | 1.2867 |
+ | 1.1195 | 491.0 | 6383 | 1.3153 |
+ | 1.1304 | 492.0 | 6396 | 1.2762 |
+ | 1.1268 | 493.0 | 6409 | 1.2757 |
+ | 1.1478 | 494.0 | 6422 | 1.2493 |
+ | 1.1527 | 495.0 | 6435 | 1.2793 |
+ | 1.1252 | 496.0 | 6448 | 1.2435 |
+ | 1.1307 | 497.0 | 6461 | 1.3311 |
+ | 1.1163 | 498.0 | 6474 | 1.3016 |
+ | 1.099 | 499.0 | 6487 | 1.3532 |
+ | 1.1246 | 500.0 | 6500 | 1.2222 |
+
+
config.json ADDED
@@ -0,0 +1,26 @@
+ {
+   "_name_or_path": "nlpaueb/bert-base-greek-uncased-v1",
+   "architectures": [
+     "BertForMaskedLM"
+   ],
+   "attention_probs_dropout_prob": 0.1,
+   "classifier_dropout": null,
+   "hidden_act": "gelu",
+   "hidden_dropout_prob": 0.1,
+   "hidden_size": 768,
+   "initializer_range": 0.02,
+   "intermediate_size": 3072,
+   "layer_norm_eps": 1e-12,
+   "max_position_embeddings": 512,
+   "model_type": "bert",
+   "num_attention_heads": 12,
+   "num_hidden_layers": 12,
+   "output_past": true,
+   "pad_token_id": 0,
+   "position_embedding_type": "absolute",
+   "torch_dtype": "float32",
+   "transformers_version": "4.31.0",
+   "type_vocab_size": 2,
+   "use_cache": true,
+   "vocab_size": 35000
+ }
generation_config.json ADDED
@@ -0,0 +1,5 @@
+ {
+   "_from_model_config": true,
+   "pad_token_id": 0,
+   "transformers_version": "4.31.0"
+ }
pytorch_model.bin ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:2d8ddb175035b5df4e7520c2dabccfd4937c3a00a7533b1482077381ac093e91
+ size 451900469
runs/Aug04_15-33-45_526b269448fb/events.out.tfevents.1691163235.526b269448fb.12888.0 ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:35674a2bbb11aa91477697b49300c0d36aec6cb7c44b7bd26416fab5b86a8fd2
+ size 4350
runs/Aug04_15-36-29_526b269448fb/events.out.tfevents.1691163400.526b269448fb.12888.1 ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:b3e1367a58882baf9c7d3dea165e0c6e705379d4a8af2135fbb3de3c5b540d2b
+ size 297
runs/Aug04_15-36-48_526b269448fb/events.out.tfevents.1691163414.526b269448fb.12888.2 ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:4de6e731c0482780b7ce608740932480493b507a6865414a47a3b969ea5b33af
+ size 225066
runs/Aug04_15-36-48_526b269448fb/events.out.tfevents.1691175372.526b269448fb.12888.3 ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:798ed1ed672d6711845da628482791b5ac1086488446a92787af85cacde03c7e
+ size 359
special_tokens_map.json ADDED
@@ -0,0 +1,7 @@
+ {
+   "cls_token": "[CLS]",
+   "mask_token": "[MASK]",
+   "pad_token": "[PAD]",
+   "sep_token": "[SEP]",
+   "unk_token": "[UNK]"
+ }
tokenizer.json ADDED
The diff for this file is too large to render. See raw diff
 
tokenizer_config.json ADDED
@@ -0,0 +1,15 @@
+ {
+   "clean_up_tokenization_spaces": true,
+   "cls_token": "[CLS]",
+   "do_basic_tokenize": true,
+   "do_lower_case": true,
+   "mask_token": "[MASK]",
+   "model_max_length": 1000000000000000019884624838656,
+   "never_split": null,
+   "pad_token": "[PAD]",
+   "sep_token": "[SEP]",
+   "strip_accents": null,
+   "tokenize_chinese_chars": true,
+   "tokenizer_class": "BertTokenizer",
+   "unk_token": "[UNK]"
+ }
training_args.bin ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:e10a0177b1fb9dbdfae79f8fa1cbfe52222fbdcba3e01bf723b63da0e9748ca9
+ size 4027
vocab.txt ADDED
The diff for this file is too large to render. See raw diff