lilyyellow committed
Commit
93970f5
1 Parent(s): 3c8802f

End of training

Files changed (4)
  1. README.md +23 -29
  2. config.json +50 -58
  3. model.safetensors +2 -2
  4. training_args.bin +1 -1
README.md CHANGED
@@ -14,25 +14,23 @@ should probably proofread and complete it, then remove this comment. -->
 
 This model is a fine-tuned version of [NlpHUST/ner-vietnamese-electra-base](https://huggingface.co/NlpHUST/ner-vietnamese-electra-base) on an unknown dataset.
 It achieves the following results on the evaluation set:
- - Loss: 0.5631
- - Age: {'precision': 0.8356164383561644, 'recall': 0.9242424242424242, 'f1': 0.8776978417266188, 'number': 132}
- - Datetime: {'precision': 0.7099903006789525, 'recall': 0.7439024390243902, 'f1': 0.7265508684863523, 'number': 984}
- - Disease: {'precision': 0.7104247104247104, 'recall': 0.6501766784452296, 'f1': 0.6789667896678967, 'number': 283}
- - Event: {'precision': 0.2966666666666667, 'recall': 0.3371212121212121, 'f1': 0.31560283687943264, 'number': 264}
- - Gender: {'precision': 0.775, 'recall': 0.8157894736842105, 'f1': 0.7948717948717949, 'number': 114}
- - Law: {'precision': 0.6129032258064516, 'recall': 0.6758893280632411, 'f1': 0.6428571428571429, 'number': 253}
- - Location: {'precision': 0.7188841201716738, 'recall': 0.7326407873154729, 'f1': 0.7256972650961276, 'number': 1829}
- - Organization: {'precision': 0.64702154626109, 'recall': 0.7256574271499645, 'f1': 0.6840871021775545, 'number': 1407}
- - Person: {'precision': 0.697508896797153, 'recall': 0.7346326836581709, 'f1': 0.7155896312522818, 'number': 1334}
- - Phone: {'precision': 0.8735632183908046, 'recall': 0.9743589743589743, 'f1': 0.9212121212121213, 'number': 78}
- - Product: {'precision': 0.43661971830985913, 'recall': 0.36328125, 'f1': 0.3965884861407249, 'number': 256}
- - Quantity: {'precision': 0.5562913907284768, 'recall': 0.6176470588235294, 'f1': 0.5853658536585366, 'number': 544}
- - Role: {'precision': 0.458477508650519, 'recall': 0.5105973025048169, 'f1': 0.4831358249772105, 'number': 519}
- - Transportation: {'precision': 0.49710982658959535, 'recall': 0.6231884057971014, 'f1': 0.5530546623794212, 'number': 138}
- - Overall Precision: 0.6470
- - Overall Recall: 0.6869
- - Overall F1: 0.6663
- - Overall Accuracy: 0.8887
+ - Loss: 0.3274
+ - Age: {'precision': 0.9069767441860465, 'recall': 0.8731343283582089, 'f1': 0.8897338403041826, 'number': 134}
+ - Datetime: {'precision': 0.6740947075208914, 'recall': 0.7355623100303952, 'f1': 0.7034883720930233, 'number': 987}
+ - Disease: {'precision': 0.6631944444444444, 'recall': 0.7290076335877863, 'f1': 0.6945454545454546, 'number': 262}
+ - Event: {'precision': 0.290625, 'recall': 0.33214285714285713, 'f1': 0.31, 'number': 280}
+ - Gender: {'precision': 0.8266666666666667, 'recall': 0.7126436781609196, 'f1': 0.7654320987654321, 'number': 87}
+ - Law: {'precision': 0.5854430379746836, 'recall': 0.7254901960784313, 'f1': 0.647985989492119, 'number': 255}
+ - Location: {'precision': 0.6700662927078022, 'recall': 0.732033426183844, 'f1': 0.6996805111821087, 'number': 1795}
+ - Organization: {'precision': 0.5947934352009054, 'recall': 0.6946463978849967, 'f1': 0.6408536585365853, 'number': 1513}
+ - Person: {'precision': 0.6908841672378341, 'recall': 0.7251798561151079, 'f1': 0.7076167076167076, 'number': 1390}
+ - Quantity: {'precision': 0.5075528700906344, 'recall': 0.5936395759717314, 'f1': 0.5472312703583061, 'number': 566}
+ - Role: {'precision': 0.465818759936407, 'recall': 0.5356489945155393, 'f1': 0.49829931972789115, 'number': 547}
+ - Transportation: {'precision': 0.46153846153846156, 'recall': 0.5217391304347826, 'f1': 0.4897959183673469, 'number': 115}
+ - Overall Precision: 0.6168
+ - Overall Recall: 0.6854
+ - Overall F1: 0.6493
+ - Overall Accuracy: 0.8998
 
 ## Model description
 
@@ -52,22 +50,18 @@ More information needed
 
 The following hyperparameters were used during training:
 - learning_rate: 5e-05
- - train_batch_size: 16
- - eval_batch_size: 16
+ - train_batch_size: 32
+ - eval_batch_size: 32
 - seed: 42
 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
 - lr_scheduler_type: linear
- - num_epochs: 10
+ - num_epochs: 2
 
 ### Training results
 
- | Training Loss | Epoch | Step | Validation Loss | Age | Datetime | Disease | Event | Gender | Law | Location | Organization | Person | Phone | Product | Quantity | Role | Transportation | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
- |:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|
- | 0.2991 | 1.9991 | 2313 | 0.3382 | {'precision': 0.8266666666666667, 'recall': 0.9393939393939394, 'f1': 0.8794326241134751, 'number': 132} | {'precision': 0.7038095238095238, 'recall': 0.7510162601626016, 'f1': 0.726647000983284, 'number': 984} | {'precision': 0.7405857740585774, 'recall': 0.6254416961130742, 'f1': 0.6781609195402298, 'number': 283} | {'precision': 0.3445692883895131, 'recall': 0.3484848484848485, 'f1': 0.3465160075329567, 'number': 264} | {'precision': 0.752, 'recall': 0.8245614035087719, 'f1': 0.7866108786610879, 'number': 114} | {'precision': 0.5314465408805031, 'recall': 0.6679841897233202, 'f1': 0.5919439579684764, 'number': 253} | {'precision': 0.7042633567188343, 'recall': 0.7135046473482778, 'f1': 0.7088538837588269, 'number': 1829} | {'precision': 0.613166144200627, 'recall': 0.6950959488272921, 'f1': 0.6515656229180545, 'number': 1407} | {'precision': 0.7142857142857143, 'recall': 0.704647676161919, 'f1': 0.709433962264151, 'number': 1334} | {'precision': 0.8292682926829268, 'recall': 0.8717948717948718, 'f1': 0.8500000000000001, 'number': 78} | {'precision': 0.3772455089820359, 'recall': 0.24609375, 'f1': 0.2978723404255319, 'number': 256} | {'precision': 0.5856873822975518, 'recall': 0.5716911764705882, 'f1': 0.5786046511627907, 'number': 544} | {'precision': 0.45471349353049906, 'recall': 0.47398843930635837, 'f1': 0.4641509433962264, 'number': 519} | {'precision': 0.52046783625731, 'recall': 0.644927536231884, 'f1': 0.5760517799352751, 'number': 138} | 0.6419 | 0.6632 | 0.6524 | 0.8923 |
- | 0.2032 | 3.9983 | 4626 | 0.3906 | {'precision': 0.8741258741258742, 'recall': 0.946969696969697, 'f1': 0.9090909090909091, 'number': 132} | {'precision': 0.7048710601719198, 'recall': 0.75, 'f1': 0.7267355982274742, 'number': 984} | {'precision': 0.6943396226415094, 'recall': 0.6501766784452296, 'f1': 0.6715328467153284, 'number': 283} | {'precision': 0.3344370860927152, 'recall': 0.38257575757575757, 'f1': 0.3568904593639575, 'number': 264} | {'precision': 0.8214285714285714, 'recall': 0.8070175438596491, 'f1': 0.8141592920353982, 'number': 114} | {'precision': 0.547112462006079, 'recall': 0.7114624505928854, 'f1': 0.6185567010309279, 'number': 253} | {'precision': 0.7110286320254506, 'recall': 0.7331875341716785, 'f1': 0.7219380888290713, 'number': 1829} | {'precision': 0.623030303030303, 'recall': 0.7306325515280739, 'f1': 0.6725547922800131, 'number': 1407} | {'precision': 0.6707482993197279, 'recall': 0.7391304347826086, 'f1': 0.7032810271041369, 'number': 1334} | {'precision': 0.9156626506024096, 'recall': 0.9743589743589743, 'f1': 0.9440993788819876, 'number': 78} | {'precision': 0.38738738738738737, 'recall': 0.3359375, 'f1': 0.3598326359832636, 'number': 256} | {'precision': 0.5631399317406144, 'recall': 0.6066176470588235, 'f1': 0.584070796460177, 'number': 544} | {'precision': 0.43454545454545457, 'recall': 0.4605009633911368, 'f1': 0.44714686623012156, 'number': 519} | {'precision': 0.5153374233128835, 'recall': 0.6086956521739131, 'f1': 0.5581395348837209, 'number': 138} | 0.6347 | 0.6872 | 0.6599 | 0.8900 |
- | 0.138 | 5.9974 | 6939 | 0.4454 | {'precision': 0.8299319727891157, 'recall': 0.9242424242424242, 'f1': 0.8745519713261649, 'number': 132} | {'precision': 0.7272727272727273, 'recall': 0.7479674796747967, 'f1': 0.7374749498997997, 'number': 984} | {'precision': 0.7192307692307692, 'recall': 0.6607773851590106, 'f1': 0.6887661141804788, 'number': 283} | {'precision': 0.32113821138211385, 'recall': 0.29924242424242425, 'f1': 0.30980392156862746, 'number': 264} | {'precision': 0.8103448275862069, 'recall': 0.8245614035087719, 'f1': 0.8173913043478261, 'number': 114} | {'precision': 0.5620915032679739, 'recall': 0.6798418972332015, 'f1': 0.6153846153846154, 'number': 253} | {'precision': 0.7173678532901834, 'recall': 0.7271733187534172, 'f1': 0.7222373065435785, 'number': 1829} | {'precision': 0.6416717885679164, 'recall': 0.7420042643923241, 'f1': 0.6882003955174687, 'number': 1407} | {'precision': 0.678254942058623, 'recall': 0.7458770614692654, 'f1': 0.7104605498036416, 'number': 1334} | {'precision': 0.8444444444444444, 'recall': 0.9743589743589743, 'f1': 0.9047619047619048, 'number': 78} | {'precision': 0.40358744394618834, 'recall': 0.3515625, 'f1': 0.3757828810020877, 'number': 256} | {'precision': 0.5023041474654378, 'recall': 0.6011029411764706, 'f1': 0.5472803347280336, 'number': 544} | {'precision': 0.4894433781190019, 'recall': 0.4913294797687861, 'f1': 0.49038461538461536, 'number': 519} | {'precision': 0.49696969696969695, 'recall': 0.5942028985507246, 'f1': 0.5412541254125411, 'number': 138} | 0.6435 | 0.6870 | 0.6646 | 0.8883 |
- | 0.0864 | 7.9965 | 9252 | 0.5145 | {'precision': 0.8299319727891157, 'recall': 0.9242424242424242, 'f1': 0.8745519713261649, 'number': 132} | {'precision': 0.7032442748091603, 'recall': 0.7489837398373984, 'f1': 0.7253937007874016, 'number': 984} | {'precision': 0.7007575757575758, 'recall': 0.6537102473498233, 'f1': 0.676416819012797, 'number': 283} | {'precision': 0.3114754098360656, 'recall': 0.35984848484848486, 'f1': 0.3339191564147628, 'number': 264} | {'precision': 0.7833333333333333, 'recall': 0.8245614035087719, 'f1': 0.8034188034188033, 'number': 114} | {'precision': 0.5866666666666667, 'recall': 0.6956521739130435, 'f1': 0.6365280289330922, 'number': 253} | {'precision': 0.7332242225859247, 'recall': 0.7348277747402953, 'f1': 0.7340251228836702, 'number': 1829} | {'precision': 0.6387176325524044, 'recall': 0.736318407960199, 'f1': 0.6840541432816111, 'number': 1407} | {'precision': 0.686013986013986, 'recall': 0.7353823088455772, 'f1': 0.7098408104196816, 'number': 1334} | {'precision': 0.8837209302325582, 'recall': 0.9743589743589743, 'f1': 0.9268292682926831, 'number': 78} | {'precision': 0.4186991869918699, 'recall': 0.40234375, 'f1': 0.41035856573705176, 'number': 256} | {'precision': 0.5604026845637584, 'recall': 0.6139705882352942, 'f1': 0.5859649122807018, 'number': 544} | {'precision': 0.4489112227805695, 'recall': 0.5163776493256262, 'f1': 0.48028673835125446, 'number': 519} | {'precision': 0.47752808988764045, 'recall': 0.6159420289855072, 'f1': 0.5379746835443038, 'number': 138} | 0.6425 | 0.6928 | 0.6667 | 0.8880 |
- | 0.0665 | 9.9957 | 11565 | 0.5631 | {'precision': 0.8356164383561644, 'recall': 0.9242424242424242, 'f1': 0.8776978417266188, 'number': 132} | {'precision': 0.7099903006789525, 'recall': 0.7439024390243902, 'f1': 0.7265508684863523, 'number': 984} | {'precision': 0.7104247104247104, 'recall': 0.6501766784452296, 'f1': 0.6789667896678967, 'number': 283} | {'precision': 0.2966666666666667, 'recall': 0.3371212121212121, 'f1': 0.31560283687943264, 'number': 264} | {'precision': 0.775, 'recall': 0.8157894736842105, 'f1': 0.7948717948717949, 'number': 114} | {'precision': 0.6129032258064516, 'recall': 0.6758893280632411, 'f1': 0.6428571428571429, 'number': 253} | {'precision': 0.7188841201716738, 'recall': 0.7326407873154729, 'f1': 0.7256972650961276, 'number': 1829} | {'precision': 0.64702154626109, 'recall': 0.7256574271499645, 'f1': 0.6840871021775545, 'number': 1407} | {'precision': 0.697508896797153, 'recall': 0.7346326836581709, 'f1': 0.7155896312522818, 'number': 1334} | {'precision': 0.8735632183908046, 'recall': 0.9743589743589743, 'f1': 0.9212121212121213, 'number': 78} | {'precision': 0.43661971830985913, 'recall': 0.36328125, 'f1': 0.3965884861407249, 'number': 256} | {'precision': 0.5562913907284768, 'recall': 0.6176470588235294, 'f1': 0.5853658536585366, 'number': 544} | {'precision': 0.458477508650519, 'recall': 0.5105973025048169, 'f1': 0.4831358249772105, 'number': 519} | {'precision': 0.49710982658959535, 'recall': 0.6231884057971014, 'f1': 0.5530546623794212, 'number': 138} | 0.6470 | 0.6869 | 0.6663 | 0.8887 |
+ | Training Loss | Epoch | Step | Validation Loss | Age | Datetime | Disease | Event | Gender | Law | Location | Organization | Person | Quantity | Role | Transportation | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
+ |:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|
+ | 0.2854 | 1.9965 | 1156 | 0.3274 | {'precision': 0.9069767441860465, 'recall': 0.8731343283582089, 'f1': 0.8897338403041826, 'number': 134} | {'precision': 0.6740947075208914, 'recall': 0.7355623100303952, 'f1': 0.7034883720930233, 'number': 987} | {'precision': 0.6631944444444444, 'recall': 0.7290076335877863, 'f1': 0.6945454545454546, 'number': 262} | {'precision': 0.290625, 'recall': 0.33214285714285713, 'f1': 0.31, 'number': 280} | {'precision': 0.8266666666666667, 'recall': 0.7126436781609196, 'f1': 0.7654320987654321, 'number': 87} | {'precision': 0.5854430379746836, 'recall': 0.7254901960784313, 'f1': 0.647985989492119, 'number': 255} | {'precision': 0.6700662927078022, 'recall': 0.732033426183844, 'f1': 0.6996805111821087, 'number': 1795} | {'precision': 0.5947934352009054, 'recall': 0.6946463978849967, 'f1': 0.6408536585365853, 'number': 1513} | {'precision': 0.6908841672378341, 'recall': 0.7251798561151079, 'f1': 0.7076167076167076, 'number': 1390} | {'precision': 0.5075528700906344, 'recall': 0.5936395759717314, 'f1': 0.5472312703583061, 'number': 566} | {'precision': 0.465818759936407, 'recall': 0.5356489945155393, 'f1': 0.49829931972789115, 'number': 547} | {'precision': 0.46153846153846156, 'recall': 0.5217391304347826, 'f1': 0.4897959183673469, 'number': 115} | 0.6168 | 0.6854 | 0.6493 | 0.8998 |
 
 
 ### Framework versions
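The updated hyperparameter list maps onto a standard `transformers` Trainer run over the base checkpoint. A minimal sketch is below; the dataset and preprocessing are not described in the card, so `train_ds`/`eval_ds` and the output directory are placeholders, and `num_labels=25` with `ignore_mismatched_sizes=True` is an assumption about how the smaller tag set from the new config.json was attached to the base model.

```python
# Sketch only: hyperparameter values come from the README above;
# everything about the dataset is unknown, so train_ds/eval_ds are placeholders.
from transformers import (
    AutoModelForTokenClassification,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

model_name = "NlpHUST/ner-vietnamese-electra-base"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForTokenClassification.from_pretrained(
    model_name,
    num_labels=25,                   # assumption: the 25-tag scheme in the new config.json
    ignore_mismatched_sizes=True,    # assumption: classifier head re-initialized for the new labels
)

args = TrainingArguments(
    output_dir="ner-vietnamese-electra-base-finetuned",  # placeholder name
    learning_rate=5e-5,
    per_device_train_batch_size=32,  # changed from 16 in the previous run
    per_device_eval_batch_size=32,
    num_train_epochs=2,              # changed from 10
    lr_scheduler_type="linear",
    seed=42,
    # Adam betas=(0.9, 0.999) and epsilon=1e-08 are the TrainingArguments defaults.
)

# trainer = Trainer(model=model, args=args, train_dataset=train_ds,
#                   eval_dataset=eval_ds, tokenizer=tokenizer)
# trainer.train()
```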
config.json CHANGED
@@ -11,68 +11,60 @@
 "hidden_dropout_prob": 0.1,
 "hidden_size": 768,
 "id2label": {
- "0": "I-DATETIME",
- "1": "B-PERSON",
- "2": "I-AGE",
- "3": "O",
- "4": "B-ORGANIZATION",
- "5": "B-ROLE",
- "6": "B-DATETIME",
- "7": "I-PERSON",
- "8": "B-AGE",
- "9": "B-QUANTITY",
- "10": "B-LAW",
- "11": "I-GENDER",
- "12": "I-DISEASE",
- "13": "B-PRODUCT",
- "14": "I-ROLE",
- "15": "I-LAW",
- "16": "B-GENDER",
- "17": "B-LOCATION",
- "18": "I-TRANSPORTATION",
- "19": "I-PRODUCT",
- "20": "I-LOCATION",
- "21": "B-EVENT",
- "22": "B-PHONE",
- "23": "B-DISEASE",
- "24": "I-QUANTITY",
- "25": "I-PHONE",
- "26": "I-EVENT",
- "27": "B-TRANSPORTATION",
- "28": "I-ORGANIZATION"
+ "0": "I-GENDER",
+ "1": "B-LAW",
+ "2": "B-QUANTITY",
+ "3": "B-PERSON",
+ "4": "B-AGE",
+ "5": "I-QUANTITY",
+ "6": "I-LOCATION",
+ "7": "I-DATETIME",
+ "8": "I-ROLE",
+ "9": "I-DISEASE",
+ "10": "I-LAW",
+ "11": "B-GENDER",
+ "12": "B-EVENT",
+ "13": "O",
+ "14": "I-AGE",
+ "15": "B-LOCATION",
+ "16": "B-DISEASE",
+ "17": "B-ORGANIZATION",
+ "18": "I-ORGANIZATION",
+ "19": "B-TRANSPORTATION",
+ "20": "I-EVENT",
+ "21": "I-TRANSPORTATION",
+ "22": "B-ROLE",
+ "23": "I-PERSON",
+ "24": "B-DATETIME"
 },
 "initializer_range": 0.02,
 "intermediate_size": 3072,
 "label2id": {
- "B-AGE": 8,
- "B-DATETIME": 6,
- "B-DISEASE": 23,
- "B-EVENT": 21,
- "B-GENDER": 16,
- "B-LAW": 10,
- "B-LOCATION": 17,
- "B-ORGANIZATION": 4,
- "B-PERSON": 1,
- "B-PHONE": 22,
- "B-PRODUCT": 13,
- "B-QUANTITY": 9,
- "B-ROLE": 5,
- "B-TRANSPORTATION": 27,
- "I-AGE": 2,
- "I-DATETIME": 0,
- "I-DISEASE": 12,
- "I-EVENT": 26,
- "I-GENDER": 11,
- "I-LAW": 15,
- "I-LOCATION": 20,
- "I-ORGANIZATION": 28,
- "I-PERSON": 7,
- "I-PHONE": 25,
- "I-PRODUCT": 19,
- "I-QUANTITY": 24,
- "I-ROLE": 14,
- "I-TRANSPORTATION": 18,
- "O": 3
+ "B-AGE": 4,
+ "B-DATETIME": 24,
+ "B-DISEASE": 16,
+ "B-EVENT": 12,
+ "B-GENDER": 11,
+ "B-LAW": 1,
+ "B-LOCATION": 15,
+ "B-ORGANIZATION": 17,
+ "B-PERSON": 3,
+ "B-QUANTITY": 2,
+ "B-ROLE": 22,
+ "B-TRANSPORTATION": 19,
+ "I-AGE": 14,
+ "I-DATETIME": 7,
+ "I-DISEASE": 9,
+ "I-EVENT": 20,
+ "I-GENDER": 0,
+ "I-LAW": 10,
+ "I-LOCATION": 6,
+ "I-ORGANIZATION": 18,
+ "I-PERSON": 23,
+ "I-QUANTITY": 5,
+ "I-ROLE": 8,
+ "I-TRANSPORTATION": 21,
+ "O": 13
 },
 "layer_norm_eps": 1e-12,
 "max_position_embeddings": 512,
model.safetensors CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
- oid sha256:48032d19ac5be0a9c37d60ef83db995fe8d21a13b70566e4187dbcec79e89c40
- size 532380148
+ oid sha256:7b26586038bd2b6370113a5f2d06a12b8eb35624fe4e929cfae92d504da98387
+ size 532367844
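The ~12 kB drop in model.safetensors is consistent with the smaller classifier head implied by the config change (29 tags down to 25). A quick arithmetic check, assuming float32 weights and that only the head's output dimension changed:

```python
# Sanity check of the LFS pointer sizes against the label-count change.
old_num_labels, new_num_labels = 29, 25   # from the old/new id2label maps
hidden_size = 768                         # from config.json
bytes_per_param = 4                       # assumption: float32 storage

# Token-classification head is Linear(hidden_size, num_labels): one weight row + one bias per label.
delta = (old_num_labels - new_num_labels) * (hidden_size + 1) * bytes_per_param
print(delta)                    # 12304
print(532380148 - 532367844)    # 12304 -- matches the pointer files above
```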
training_args.bin CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
- oid sha256:3f3cc65b78dbb04788aa8f1ec4902b0bfc2588c5688921b40ad40e93de760604
+ oid sha256:ae934c411f42ed098a94aabeb09d3540809d82ecba967735ddf10cefe14a36d7
 size 5112
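training_args.bin is the pickled TrainingArguments object that Trainer writes next to the model weights; only its content hash changes here. One way to confirm the hyperparameters listed in the card, assuming the file has been downloaded locally with compatible torch/transformers versions installed:

```python
# Inspect the saved TrainingArguments; weights_only=False is needed because this
# is a pickled Python object, not a tensor checkpoint.
import torch

args = torch.load("training_args.bin", weights_only=False)
print(args.per_device_train_batch_size)  # expected: 32
print(args.num_train_epochs)             # expected: 2
print(args.learning_rate)                # expected: 5e-05
```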