dl-ru committed
Commit bb40463
1 parent: cb7c402

New version with explicit predicate marking

Files changed (5)
  1. README.md +68 -59
  2. config.json +42 -44
  3. pytorch_model.bin +2 -2
  4. tokenizer.json +1 -6
  5. training_args.bin +2 -2
README.md CHANGED
@@ -15,59 +15,67 @@ should probably proofread and complete it, then remove this comment. -->
 
  This model is a fine-tuned version of [ai-forever/ruElectra-medium](https://huggingface.co/ai-forever/ruElectra-medium) on an unknown dataset.
  It achieves the following results on the evaluation set:
- - Loss: 0.1367
- - Addressee Precision: 0.8793
- - Addressee Recall: 0.8947
- - Addressee F1: 0.8870
- - Addressee Number: 57
- - Benefactive Precision: 0.6
- - Benefactive Recall: 0.3
- - Benefactive F1: 0.4
- - Benefactive Number: 10
- - Causator Precision: 0.9296
- - Causator Recall: 0.8049
- - Causator F1: 0.8627
- - Causator Number: 82
- - Cause Precision: 0.5618
  - Cause Recall: 0.7353
- - Cause F1: 0.6369
- - Cause Number: 68
- - Contrsubject Precision: 0.8409
- - Contrsubject Recall: 0.925
- - Contrsubject F1: 0.8810
- - Contrsubject Number: 120
- - Deliberative Precision: 0.9074
- - Deliberative Recall: 0.9423
- - Deliberative F1: 0.9245
- - Deliberative Number: 52
- - Destinative Precision: 0.9130
- - Destinative Recall: 0.875
- - Destinative F1: 0.8936
- - Destinative Number: 24
- - Directivefinal Precision: 0.6154
  - Directivefinal Recall: 0.6667
- - Directivefinal F1: 0.64
- - Directivefinal Number: 12
- - Experiencer Precision: 0.8525
- - Experiencer Recall: 0.8660
- - Experiencer F1: 0.8592
- - Experiencer Number: 694
- - Instrument Precision: 1.0
- - Instrument Recall: 0.1111
- - Instrument F1: 0.2000
  - Instrument Number: 9
- - Mediative Precision: 0.0
  - Mediative Recall: 0.0
  - Mediative F1: 0.0
- - Mediative Number: 1
- - Object Precision: 0.8735
- - Object Recall: 0.8924
- - Object F1: 0.8828
- - Object Number: 1524
- - Overall Precision: 0.8571
- - Overall Recall: 0.8749
- - Overall F1: 0.8659
- - Overall Accuracy: 0.9711
 
  ## Model description
 
@@ -86,25 +94,26 @@ More information needed
  ### Training hyperparameters
 
  The following hyperparameters were used during training:
- - learning_rate: 9.81632502988664e-05
- - train_batch_size: 4
  - eval_batch_size: 1
- - seed: 605573
- - gradient_accumulation_steps: 2
  - total_train_batch_size: 8
  - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  - lr_scheduler_type: linear
  - num_epochs: 5
 
  ### Training results
 
- | Training Loss | Epoch | Step | Validation Loss | Addressee Precision | Addressee Recall | Addressee F1 | Addressee Number | Benefactive Precision | Benefactive Recall | Benefactive F1 | Benefactive Number | Causator Precision | Causator Recall | Causator F1 | Causator Number | Cause Precision | Cause Recall | Cause F1 | Cause Number | Contrsubject Precision | Contrsubject Recall | Contrsubject F1 | Contrsubject Number | Deliberative Precision | Deliberative Recall | Deliberative F1 | Deliberative Number | Destinative Precision | Destinative Recall | Destinative F1 | Destinative Number | Directivefinal Precision | Directivefinal Recall | Directivefinal F1 | Directivefinal Number | Experiencer Precision | Experiencer Recall | Experiencer F1 | Experiencer Number | Instrument Precision | Instrument Recall | Instrument F1 | Instrument Number | Mediative Precision | Mediative Recall | Mediative F1 | Mediative Number | Object Precision | Object Recall | Object F1 | Object Number | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
- |:-------------:|:-----:|:----:|:---------------:|:-------------------:|:----------------:|:------------:|:----------------:|:---------------------:|:------------------:|:--------------:|:------------------:|:------------------:|:---------------:|:-----------:|:---------------:|:---------------:|:------------:|:--------:|:------------:|:----------------------:|:-------------------:|:---------------:|:-------------------:|:----------------------:|:-------------------:|:---------------:|:-------------------:|:---------------------:|:------------------:|:--------------:|:------------------:|:------------------------:|:---------------------:|:-----------------:|:---------------------:|:---------------------:|:------------------:|:--------------:|:------------------:|:--------------------:|:-----------------:|:-------------:|:-----------------:|:-------------------:|:----------------:|:------------:|:----------------:|:----------------:|:-------------:|:---------:|:-------------:|:-----------------:|:--------------:|:----------:|:----------------:|
- | 0.1821 | 1.0 | 724 | 0.1479 | 0.5761 | 0.9298 | 0.7114 | 57 | 0.0 | 0.0 | 0.0 | 10 | 0.6867 | 0.6951 | 0.6909 | 82 | 0.72 | 0.2647 | 0.3871 | 68 | 0.8171 | 0.5583 | 0.6634 | 120 | 0.5111 | 0.4423 | 0.4742 | 52 | 0.0 | 0.0 | 0.0 | 24 | 0.0 | 0.0 | 0.0 | 12 | 0.8496 | 0.8141 | 0.8315 | 694 | 0.0 | 0.0 | 0.0 | 9 | 0.0 | 0.0 | 0.0 | 1 | 0.8183 | 0.8688 | 0.8428 | 1524 | 0.8073 | 0.7942 | 0.8007 | 0.9619 |
- | 0.0938 | 2.0 | 1448 | 0.1384 | 0.6714 | 0.8246 | 0.7402 | 57 | 0.0 | 0.0 | 0.0 | 10 | 0.8649 | 0.7805 | 0.8205 | 82 | 0.5067 | 0.5588 | 0.5315 | 68 | 0.7329 | 0.8917 | 0.8045 | 120 | 0.5465 | 0.9038 | 0.6812 | 52 | 0.0 | 0.0 | 0.0 | 24 | 0.5556 | 0.4167 | 0.4762 | 12 | 0.7835 | 0.9179 | 0.8454 | 694 | 0.0 | 0.0 | 0.0 | 9 | 0.0 | 0.0 | 0.0 | 1 | 0.8329 | 0.8832 | 0.8573 | 1524 | 0.7930 | 0.8636 | 0.8268 | 0.9635 |
- | 0.0627 | 3.0 | 2172 | 0.1194 | 0.8125 | 0.9123 | 0.8595 | 57 | 0.25 | 0.2 | 0.2222 | 10 | 0.9178 | 0.8171 | 0.8645 | 82 | 0.5 | 0.6176 | 0.5526 | 68 | 0.7343 | 0.875 | 0.7985 | 120 | 0.8980 | 0.8462 | 0.8713 | 52 | 0.8421 | 0.6667 | 0.7442 | 24 | 0.7273 | 0.6667 | 0.6957 | 12 | 0.8815 | 0.8357 | 0.8580 | 694 | 0.0 | 0.0 | 0.0 | 9 | 0.0 | 0.0 | 0.0 | 1 | 0.8579 | 0.8871 | 0.8723 | 1524 | 0.8447 | 0.8549 | 0.8498 | 0.9687 |
- | 0.0501 | 4.0 | 2896 | 0.1276 | 0.8772 | 0.8772 | 0.8772 | 57 | 0.6667 | 0.4 | 0.5 | 10 | 0.9242 | 0.7439 | 0.8243 | 82 | 0.5604 | 0.75 | 0.6415 | 68 | 0.8409 | 0.925 | 0.8810 | 120 | 0.9245 | 0.9423 | 0.9333 | 52 | 0.9130 | 0.875 | 0.8936 | 24 | 0.6154 | 0.6667 | 0.64 | 12 | 0.8693 | 0.8530 | 0.8611 | 694 | 0.0 | 0.0 | 0.0 | 9 | 0.0 | 0.0 | 0.0 | 1 | 0.8773 | 0.8865 | 0.8819 | 1524 | 0.8633 | 0.8662 | 0.8647 | 0.9713 |
- | 0.0205 | 5.0 | 3620 | 0.1367 | 0.8793 | 0.8947 | 0.8870 | 57 | 0.6 | 0.3 | 0.4 | 10 | 0.9296 | 0.8049 | 0.8627 | 82 | 0.5618 | 0.7353 | 0.6369 | 68 | 0.8409 | 0.925 | 0.8810 | 120 | 0.9074 | 0.9423 | 0.9245 | 52 | 0.9130 | 0.875 | 0.8936 | 24 | 0.6154 | 0.6667 | 0.64 | 12 | 0.8525 | 0.8660 | 0.8592 | 694 | 1.0 | 0.1111 | 0.2000 | 9 | 0.0 | 0.0 | 0.0 | 1 | 0.8735 | 0.8924 | 0.8828 | 1524 | 0.8571 | 0.8749 | 0.8659 | 0.9711 |
 
 
  ### Framework versions
 
  This model is a fine-tuned version of [ai-forever/ruElectra-medium](https://huggingface.co/ai-forever/ruElectra-medium) on an unknown dataset.
  It achieves the following results on the evaluation set:
+ - Loss: 0.1471
+ - Addressee Precision: 0.9583
+ - Addressee Recall: 0.9020
+ - Addressee F1: 0.9293
+ - Addressee Number: 51
+ - Benefactive Precision: 0.8
+ - Benefactive Recall: 0.25
+ - Benefactive F1: 0.3810
+ - Benefactive Number: 16
+ - Causator Precision: 0.8971
+ - Causator Recall: 0.8714
+ - Causator F1: 0.8841
+ - Causator Number: 70
+ - Cause Precision: 0.6466
  - Cause Recall: 0.7353
+ - Cause F1: 0.6881
+ - Cause Number: 102
+ - Contrsubject Precision: 0.832
+ - Contrsubject Recall: 0.7879
+ - Contrsubject F1: 0.8093
+ - Contrsubject Number: 132
+ - Deliberative Precision: 0.6269
+ - Deliberative Recall: 0.84
+ - Deliberative F1: 0.7179
+ - Deliberative Number: 50
+ - Destinative Precision: 1.0
+ - Destinative Recall: 0.3871
+ - Destinative F1: 0.5581
+ - Destinative Number: 31
+ - Directivefinal Precision: 0.5455
  - Directivefinal Recall: 0.6667
+ - Directivefinal F1: 0.6
+ - Directivefinal Number: 9
+ - Experiencer Precision: 0.8669
+ - Experiencer Recall: 0.8609
+ - Experiencer F1: 0.8639
+ - Experiencer Number: 726
+ - Instrument Precision: 0.5
+ - Instrument Recall: 0.3333
+ - Instrument F1: 0.4
  - Instrument Number: 9
+ - Limitative Precision: 0.0
+ - Limitative Recall: 0.0
+ - Limitative F1: 0.0
+ - Limitative Number: 4
+ - Object Precision: 0.8676
+ - Object Recall: 0.8703
+ - Object F1: 0.8689
+ - Object Number: 1611
+ - Overall Precision: 0.8515
+ - Overall Recall: 0.8467
+ - Overall F1: 0.8491
+ - Overall Accuracy: 0.9687
+ - Directiveinitial Recall: 0.0
+ - Directiveinitial Number: 0.0
+ - Directiveinitial Precision: 0.0
+ - Directiveinitial F1: 0.0
  - Mediative Recall: 0.0
+ - Mediative Number: 0.0
+ - Mediative Precision: 0.0
  - Mediative F1: 0.0
 
  ## Model description
 
  ### Training hyperparameters
 
  The following hyperparameters were used during training:
+ - learning_rate: 0.000261433658985083
+ - train_batch_size: 1
  - eval_batch_size: 1
+ - seed: 510754
+ - gradient_accumulation_steps: 8
  - total_train_batch_size: 8
  - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  - lr_scheduler_type: linear
+ - lr_scheduler_warmup_ratio: 0.3
  - num_epochs: 5
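A quick sanity check on the new hyperparameters (a sketch, not taken from the training code): the effective batch size is the per-device batch size times the gradient accumulation steps, and with linear warmup the warmup ratio is applied to the total optimizer steps. The total-step count of 3815 is read from the final row of the training-results table below.

```python
# Sketch: verify the derived hyperparameter values in the updated run.
train_batch_size = 1             # new value in this commit
gradient_accumulation_steps = 8  # new value in this commit
total_train_batch_size = train_batch_size * gradient_accumulation_steps
print(total_train_batch_size)    # 8, matching the listed total_train_batch_size

total_steps = 3815               # final Step column in the results table
warmup_steps = int(0.3 * total_steps)  # lr_scheduler_warmup_ratio: 0.3
print(warmup_steps)              # 1144 linear-warmup optimizer steps
```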
 
  ### Training results
 
+ | Training Loss | Epoch | Step | Validation Loss | Addressee Precision | Addressee Recall | Addressee F1 | Addressee Number | Benefactive Precision | Benefactive Recall | Benefactive F1 | Benefactive Number | Causator Precision | Causator Recall | Causator F1 | Causator Number | Cause Precision | Cause Recall | Cause F1 | Cause Number | Contrsubject Precision | Contrsubject Recall | Contrsubject F1 | Contrsubject Number | Deliberative Precision | Deliberative Recall | Deliberative F1 | Deliberative Number | Destinative Precision | Destinative Recall | Destinative F1 | Destinative Number | Directivefinal Precision | Directivefinal Recall | Directivefinal F1 | Directivefinal Number | Experiencer Precision | Experiencer Recall | Experiencer F1 | Experiencer Number | Instrument Precision | Instrument Recall | Instrument F1 | Instrument Number | Limitative Precision | Limitative Recall | Limitative F1 | Limitative Number | Object Precision | Object Recall | Object F1 | Object Number | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy | Directiveinitial Recall | Directiveinitial Number | Directiveinitial Precision | Directiveinitial F1 | Mediative Recall | Mediative Number | Mediative Precision | Mediative F1 |
+ |:-------------:|:-----:|:----:|:---------------:|:-------------------:|:----------------:|:------------:|:----------------:|:---------------------:|:------------------:|:--------------:|:------------------:|:------------------:|:---------------:|:-----------:|:---------------:|:---------------:|:------------:|:--------:|:------------:|:----------------------:|:-------------------:|:---------------:|:-------------------:|:----------------------:|:-------------------:|:---------------:|:-------------------:|:---------------------:|:------------------:|:--------------:|:------------------:|:------------------------:|:---------------------:|:-----------------:|:---------------------:|:---------------------:|:------------------:|:--------------:|:------------------:|:--------------------:|:-----------------:|:-------------:|:-----------------:|:--------------------:|:-----------------:|:-------------:|:-----------------:|:----------------:|:-------------:|:---------:|:-------------:|:-----------------:|:--------------:|:----------:|:----------------:|:-----------------------:|:-----------------------:|:--------------------------:|:-------------------:|:----------------:|:----------------:|:-------------------:|:------------:|
+ | 0.2154 | 1.0 | 763 | 0.2074 | 0.6842 | 0.5098 | 0.5843 | 51 | 0.0 | 0.0 | 0.0 | 16 | 0.1946 | 0.8286 | 0.3152 | 70 | 1.0 | 0.0098 | 0.0194 | 102 | 0.2 | 0.0076 | 0.0146 | 132 | 0.0 | 0.0 | 0.0 | 50 | 0.0 | 0.0 | 0.0 | 31 | 0.0 | 0.0 | 0.0 | 9 | 0.6747 | 0.7713 | 0.7198 | 726 | 0.0 | 0.0 | 0.0 | 9 | 0.0 | 0.0 | 0.0 | 4 | 0.8199 | 0.7263 | 0.7702 | 1611 | 0.6987 | 0.6460 | 0.6713 | 0.9433 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
+ | 0.2294 | 2.0 | 1526 | 0.2028 | 0.7460 | 0.9216 | 0.8246 | 51 | 0.0 | 0.0 | 0.0 | 16 | 0.0 | 0.0 | 0.0 | 70 | 0.3333 | 0.0098 | 0.0190 | 102 | 0.7791 | 0.5076 | 0.6147 | 132 | 0.22 | 0.88 | 0.352 | 50 | 0.0 | 0.0 | 0.0 | 31 | 0.6667 | 0.6667 | 0.6667 | 9 | 0.8822 | 0.6708 | 0.7621 | 726 | 0.0 | 0.0 | 0.0 | 9 | 0.0 | 0.0 | 0.0 | 4 | 0.7332 | 0.7914 | 0.7612 | 1611 | 0.7255 | 0.6855 | 0.7050 | 0.9417 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
+ | 0.132 | 3.0 | 2290 | 0.1485 | 0.7188 | 0.9020 | 0.8 | 51 | 0.0 | 0.0 | 0.0 | 16 | 0.6854 | 0.8714 | 0.7673 | 70 | 0.4079 | 0.3039 | 0.3483 | 102 | 0.6562 | 0.7955 | 0.7192 | 132 | 0.5263 | 0.4 | 0.4545 | 50 | 0.0 | 0.0 | 0.0 | 31 | 0.6 | 0.6667 | 0.6316 | 9 | 0.8289 | 0.8609 | 0.8446 | 726 | 0.0 | 0.0 | 0.0 | 9 | 0.0 | 0.0 | 0.0 | 4 | 0.8013 | 0.8610 | 0.8300 | 1611 | 0.7806 | 0.8115 | 0.7957 | 0.9574 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
+ | 0.0748 | 4.0 | 3053 | 0.1382 | 0.9038 | 0.9216 | 0.9126 | 51 | 0.1905 | 0.25 | 0.2162 | 16 | 0.9104 | 0.8714 | 0.8905 | 70 | 0.5859 | 0.7353 | 0.6522 | 102 | 0.825 | 0.75 | 0.7857 | 132 | 0.4875 | 0.78 | 0.6 | 50 | 0.0 | 0.0 | 0.0 | 31 | 0.4615 | 0.6667 | 0.5455 | 9 | 0.9033 | 0.8237 | 0.8617 | 726 | 0.4 | 0.2222 | 0.2857 | 9 | 0.0 | 0.0 | 0.0 | 4 | 0.8468 | 0.8678 | 0.8571 | 1611 | 0.8321 | 0.8285 | 0.8303 | 0.9659 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
+ | 0.0504 | 5.0 | 3815 | 0.1471 | 0.9583 | 0.9020 | 0.9293 | 51 | 0.8 | 0.25 | 0.3810 | 16 | 0.8971 | 0.8714 | 0.8841 | 70 | 0.6466 | 0.7353 | 0.6881 | 102 | 0.832 | 0.7879 | 0.8093 | 132 | 0.6269 | 0.84 | 0.7179 | 50 | 1.0 | 0.3871 | 0.5581 | 31 | 0.5455 | 0.6667 | 0.6 | 9 | 0.8669 | 0.8609 | 0.8639 | 726 | 0.5 | 0.3333 | 0.4 | 9 | 0.0 | 0.0 | 0.0 | 4 | 0.8676 | 0.8703 | 0.8689 | 1611 | 0.8515 | 0.8467 | 0.8491 | 0.9687 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
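The per-role F1 values reported in this card are the harmonic mean of that role's precision and recall; a short sketch checking two rows of the final (epoch 5) evaluation against the listed numbers:

```python
# Sketch: F1 = 2PR / (P + R), checked against the epoch-5 evaluation rows.
def f1(precision: float, recall: float) -> float:
    if precision + recall == 0:
        return 0.0  # convention for roles with no correct predictions
    return 2 * precision * recall / (precision + recall)

print(round(f1(0.8, 0.25), 4))    # Benefactive: 0.381  (reported 0.3810)
print(round(f1(1.0, 0.3871), 4))  # Destinative: 0.5581 (reported 0.5581)
```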
 
 
  ### Framework versions
config.json CHANGED
@@ -12,54 +12,52 @@
  "hidden_size": 576,
  "id2label": {
  "0": "O",
- "1": "B-Predicate",
- "2": "B-Object",
- "3": "B-Experiencer",
- "4": "B-Cause",
- "5": "B-Deliberative",
- "6": "B-Causator",
- "7": "B-ContrSubject",
- "8": "B-Benefactive",
- "9": "B-Addressee",
- "10": "I-Object",
- "11": "B-Destinative",
- "12": "I-ContrSubject",
- "13": "B-Instrument",
- "14": "I-Deliberative",
- "15": "B-Limitative",
- "16": "B-DirectiveFinal",
- "17": "B-Mediative",
- "18": "I-DirectiveFinal",
- "19": "B-DirectiveInitial",
- "20": "I-DirectiveInitial",
- "21": "I-Experiencer",
- "22": "I-Cause"
  },
  "initializer_range": 0.02,
  "intermediate_size": 2304,
  "label2id": {
- "B-Addressee": 9,
- "B-Benefactive": 8,
- "B-Causator": 6,
- "B-Cause": 4,
- "B-ContrSubject": 7,
- "B-Deliberative": 5,
- "B-Destinative": 11,
- "B-DirectiveFinal": 16,
- "B-DirectiveInitial": 19,
- "B-Experiencer": 3,
- "B-Instrument": 13,
- "B-Limitative": 15,
- "B-Mediative": 17,
- "B-Object": 2,
- "B-Predicate": 1,
- "I-Cause": 22,
- "I-ContrSubject": 12,
- "I-Deliberative": 14,
- "I-DirectiveFinal": 18,
- "I-DirectiveInitial": 20,
- "I-Experiencer": 21,
- "I-Object": 10,
  "O": 0
  },
  "layer_norm_eps": 1e-12,
  "hidden_size": 576,
  "id2label": {
  "0": "O",
+ "1": "B-Object",
+ "2": "B-Experiencer",
+ "3": "B-Cause",
+ "4": "B-Deliberative",
+ "5": "B-Causator",
+ "6": "B-ContrSubject",
+ "7": "B-Benefactive",
+ "8": "B-Addressee",
+ "9": "I-Object",
+ "10": "B-Destinative",
+ "11": "I-ContrSubject",
+ "12": "B-Instrument",
+ "13": "I-Deliberative",
+ "14": "B-Limitative",
+ "15": "B-DirectiveFinal",
+ "16": "B-Mediative",
+ "17": "I-DirectiveFinal",
+ "18": "B-DirectiveInitial",
+ "19": "I-DirectiveInitial",
+ "20": "I-Experiencer",
+ "21": "I-Cause"
  },
  "initializer_range": 0.02,
  "intermediate_size": 2304,
  "label2id": {
+ "B-Addressee": 8,
+ "B-Benefactive": 7,
+ "B-Causator": 5,
+ "B-Cause": 3,
+ "B-ContrSubject": 6,
+ "B-Deliberative": 4,
+ "B-Destinative": 10,
+ "B-DirectiveFinal": 15,
+ "B-DirectiveInitial": 18,
+ "B-Experiencer": 2,
+ "B-Instrument": 12,
+ "B-Limitative": 14,
+ "B-Mediative": 16,
+ "B-Object": 1,
+ "I-Cause": 21,
+ "I-ContrSubject": 11,
+ "I-Deliberative": 13,
+ "I-DirectiveFinal": 17,
+ "I-DirectiveInitial": 19,
+ "I-Experiencer": 20,
+ "I-Object": 9,
  "O": 0
  },
  "layer_norm_eps": 1e-12,
pytorch_model.bin CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:49608e38da88f2cc7a74e91f5dfeeda4ac34f55947329da8a6b5521ca0cde33d
- size 340228649
+ oid sha256:1b8ef50ce5f1294afd3d610e06e06f236b2f231750523ba0c335cb1269c621e1
+ size 340226345
tokenizer.json CHANGED
@@ -1,11 +1,6 @@
  {
  "version": "1.0",
- "truncation": {
- "direction": "Right",
- "max_length": 2048,
- "strategy": "LongestFirst",
- "stride": 0
- },
+ "truncation": null,
  "padding": null,
  "added_tokens": [
  {
training_args.bin CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:cfdd72cd5d712ce859220138bff2efdb41b9156c57fc69e5ea01c3ae5b094122
- size 4155
+ oid sha256:d6b76fd56499805942ca588f2f290c4c0a3e7c80b80ef2c2b659e065090c0acb
+ size 4091