lilyyellow committed
Commit 25916df
1 Parent(s): 302933b

End of training

Files changed (1)
  1. README.md +29 -26
README.md CHANGED
@@ -1,5 +1,5 @@
 ---
- base_model: NlpHUST/ner-vietnamese-electra-base
 tags:
 - generated_from_trainer
 model-index:
@@ -12,27 +12,27 @@ should probably proofread and complete it, then remove this comment. -->
 
 # my_awesome_ner-token_classification_v1.0.7-5
 
- This model is a fine-tuned version of [NlpHUST/ner-vietnamese-electra-base](https://huggingface.co/NlpHUST/ner-vietnamese-electra-base) on an unknown dataset.
 It achieves the following results on the evaluation set:
- - Loss: 0.3789
- - Age: {'precision': 0.8503401360544217, 'recall': 0.946969696969697, 'f1': 0.8960573476702508, 'number': 132}
- - Datetime: {'precision': 0.6935483870967742, 'recall': 0.7428861788617886, 'f1': 0.7173699705593719, 'number': 984}
- - Disease: {'precision': 0.6895306859205776, 'recall': 0.6749116607773852, 'f1': 0.6821428571428573, 'number': 283}
- - Event: {'precision': 0.3210702341137124, 'recall': 0.36363636363636365, 'f1': 0.3410301953818828, 'number': 264}
- - Gender: {'precision': 0.7704918032786885, 'recall': 0.8245614035087719, 'f1': 0.7966101694915254, 'number': 114}
- - Law: {'precision': 0.5617283950617284, 'recall': 0.7193675889328063, 'f1': 0.6308492201039861, 'number': 253}
- - Location: {'precision': 0.6985105290190036, 'recall': 0.7435757244395844, 'f1': 0.7203389830508473, 'number': 1829}
- - Organization: {'precision': 0.640555906506633, 'recall': 0.7211948790896159, 'f1': 0.6784877885580463, 'number': 1406}
- - Person: {'precision': 0.7024147727272727, 'recall': 0.7408239700374532, 'f1': 0.7211082756106453, 'number': 1335}
- - Phone: {'precision': 0.8705882352941177, 'recall': 0.9487179487179487, 'f1': 0.9079754601226994, 'number': 78}
- - Product: {'precision': 0.3686274509803922, 'recall': 0.3671875, 'f1': 0.36790606653620356, 'number': 256}
- - Quantity: {'precision': 0.5566502463054187, 'recall': 0.6231617647058824, 'f1': 0.588031222896791, 'number': 544}
- - Role: {'precision': 0.4342560553633218, 'recall': 0.4836223506743738, 'f1': 0.45761166818596166, 'number': 519}
- - Transportation: {'precision': 0.49122807017543857, 'recall': 0.6086956521739131, 'f1': 0.5436893203883495, 'number': 138}
- - Overall Precision: 0.6348
- - Overall Recall: 0.6913
- - Overall F1: 0.6619
- - Overall Accuracy: 0.8912
 
 ## Model description
 
@@ -57,14 +57,17 @@ The following hyperparameters were used during training:
 - seed: 42
 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
 - lr_scheduler_type: cosine
- - num_epochs: 5
 
 ### Training results
 
- | Training Loss | Epoch | Step | Validation Loss | Age | Datetime | Disease | Event | Gender | Law | Location | Organization | Person | Phone | Product | Quantity | Role | Transportation | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
- |:-------------:|:------:|:----:|:---------------:|:-------------------------------------------------------------------------------------------------------:|:--------------------------------------------------------------------------------------------------------:|:--------------------------------------------------------------------------------------------------------:|:----------------------------------------------------------------------------------------------------------:|:--------------------------------------------------------------------------------------------------------:|:--------------------------------------------------------------------------------------------------------:|:---------------------------------------------------------------------------------------------------------:|:---------------------------------------------------------------------------------------------------------:|:---------------------------------------------------------------------------------------------------------:|:-------------------------------------------------------------------------------------------------------:|:------------------------------------------------------------------------------------------------:|:-------------------------------------------------------------------------------------------------------:|:---------------------------------------------------------------------------------------------------------:|:---------------------------------------------------------------------------------------------------------:|:-----------------:|:--------------:|:----------:|:----------------:|
- | 0.29 | 1.9991 | 2313 | 0.3353 | {'precision': 0.8561643835616438, 'recall': 0.946969696969697, 'f1': 0.8992805755395684, 'number': 132} | {'precision': 0.707647628267183, 'recall': 0.7428861788617886, 'f1': 0.7248388696083291, 'number': 984} | {'precision': 0.6946564885496184, 'recall': 0.6431095406360424, 'f1': 0.6678899082568808, 'number': 283} | {'precision': 0.34191176470588236, 'recall': 0.3522727272727273, 'f1': 0.34701492537313433, 'number': 264} | {'precision': 0.7560975609756098, 'recall': 0.8157894736842105, 'f1': 0.7848101265822786, 'number': 114} | {'precision': 0.5384615384615384, 'recall': 0.6363636363636364, 'f1': 0.5833333333333334, 'number': 253} | {'precision': 0.7157279489904357, 'recall': 0.7364680153089119, 'f1': 0.7259498787388844, 'number': 1829} | {'precision': 0.6326268464996788, 'recall': 0.7005689900426743, 'f1': 0.6648666891663854, 'number': 1406} | {'precision': 0.7298136645962733, 'recall': 0.704119850187266, 'f1': 0.7167365611894777, 'number': 1335} | {'precision': 0.8072289156626506, 'recall': 0.8589743589743589, 'f1': 0.8322981366459627, 'number': 78} | {'precision': 0.425, 'recall': 0.265625, 'f1': 0.32692307692307687, 'number': 256} | {'precision': 0.5797101449275363, 'recall': 0.5882352941176471, 'f1': 0.583941605839416, 'number': 544} | {'precision': 0.4549019607843137, 'recall': 0.44701348747591524, 'f1': 0.4509232264334305, 'number': 519} | {'precision': 0.5194805194805194, 'recall': 0.5797101449275363, 'f1': 0.5479452054794519, 'number': 138} | 0.6518 | 0.6667 | 0.6592 | 0.8937 |
- | 0.1806 | 3.9983 | 4626 | 0.3789 | {'precision': 0.8503401360544217, 'recall': 0.946969696969697, 'f1': 0.8960573476702508, 'number': 132} | {'precision': 0.6935483870967742, 'recall': 0.7428861788617886, 'f1': 0.7173699705593719, 'number': 984} | {'precision': 0.6895306859205776, 'recall': 0.6749116607773852, 'f1': 0.6821428571428573, 'number': 283} | {'precision': 0.3210702341137124, 'recall': 0.36363636363636365, 'f1': 0.3410301953818828, 'number': 264} | {'precision': 0.7704918032786885, 'recall': 0.8245614035087719, 'f1': 0.7966101694915254, 'number': 114} | {'precision': 0.5617283950617284, 'recall': 0.7193675889328063, 'f1': 0.6308492201039861, 'number': 253} | {'precision': 0.6985105290190036, 'recall': 0.7435757244395844, 'f1': 0.7203389830508473, 'number': 1829} | {'precision': 0.640555906506633, 'recall': 0.7211948790896159, 'f1': 0.6784877885580463, 'number': 1406} | {'precision': 0.7024147727272727, 'recall': 0.7408239700374532, 'f1': 0.7211082756106453, 'number': 1335} | {'precision': 0.8705882352941177, 'recall': 0.9487179487179487, 'f1': 0.9079754601226994, 'number': 78} | {'precision': 0.3686274509803922, 'recall': 0.3671875, 'f1': 0.36790606653620356, 'number': 256} | {'precision': 0.5566502463054187, 'recall': 0.6231617647058824, 'f1': 0.588031222896791, 'number': 544} | {'precision': 0.4342560553633218, 'recall': 0.4836223506743738, 'f1': 0.45761166818596166, 'number': 519} | {'precision': 0.49122807017543857, 'recall': 0.6086956521739131, 'f1': 0.5436893203883495, 'number': 138} | 0.6348 | 0.6913 | 0.6619 | 0.8912 |
 
 
 
  ### Framework versions
 
 ---
+ base_model: lilyyellow/my_awesome_ner-token_classification_v1.0.7-5
 tags:
 - generated_from_trainer
 model-index:
 
 
 # my_awesome_ner-token_classification_v1.0.7-5
 
+ This model is a fine-tuned version of [lilyyellow/my_awesome_ner-token_classification_v1.0.7-5](https://huggingface.co/lilyyellow/my_awesome_ner-token_classification_v1.0.7-5) on an unknown dataset.
 It achieves the following results on the evaluation set:
+ - Loss: 0.6412
+ - Age: {'precision': 0.8231292517006803, 'recall': 0.9166666666666666, 'f1': 0.8673835125448027, 'number': 132}
+ - Datetime: {'precision': 0.7318548387096774, 'recall': 0.7378048780487805, 'f1': 0.7348178137651821, 'number': 984}
+ - Disease: {'precision': 0.677536231884058, 'recall': 0.6607773851590106, 'f1': 0.669051878354204, 'number': 283}
+ - Event: {'precision': 0.30604982206405695, 'recall': 0.32575757575757575, 'f1': 0.3155963302752293, 'number': 264}
+ - Gender: {'precision': 0.7419354838709677, 'recall': 0.8070175438596491, 'f1': 0.7731092436974789, 'number': 114}
+ - Law: {'precision': 0.5209003215434084, 'recall': 0.6403162055335968, 'f1': 0.5744680851063829, 'number': 253}
+ - Location: {'precision': 0.7192796610169492, 'recall': 0.7424822307271733, 'f1': 0.7306967984934087, 'number': 1829}
+ - Organization: {'precision': 0.656473649967469, 'recall': 0.7176386913229018, 'f1': 0.6856948691811077, 'number': 1406}
+ - Person: {'precision': 0.7022955523672884, 'recall': 0.7333333333333333, 'f1': 0.717478930010993, 'number': 1335}
+ - Phone: {'precision': 0.8837209302325582, 'recall': 0.9743589743589743, 'f1': 0.9268292682926831, 'number': 78}
+ - Product: {'precision': 0.4470046082949309, 'recall': 0.37890625, 'f1': 0.4101479915433404, 'number': 256}
+ - Quantity: {'precision': 0.5621890547263682, 'recall': 0.6231617647058824, 'f1': 0.5911072362685265, 'number': 544}
+ - Role: {'precision': 0.47593582887700536, 'recall': 0.5144508670520231, 'f1': 0.49444444444444446, 'number': 519}
+ - Transportation: {'precision': 0.5028571428571429, 'recall': 0.6376811594202898, 'f1': 0.5623003194888179, 'number': 138}
+ - Overall Precision: 0.6503
+ - Overall Recall: 0.6868
+ - Overall F1: 0.6680
+ - Overall Accuracy: 0.8884
 
 ## Model description
 
 - seed: 42
 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
 - lr_scheduler_type: cosine
+ - num_epochs: 10
 
 ### Training results
 
+ | Training Loss | Epoch | Step | Validation Loss | Age | Datetime | Disease | Event | Gender | Law | Location | Organization | Person | Phone | Product | Quantity | Role | Transportation | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
+ |:-------------:|:------:|:-----:|:---------------:|:--------------------------------------------------------------------------------------------------------:|:--------------------------------------------------------------------------------------------------------:|:--------------------------------------------------------------------------------------------------------:|:----------------------------------------------------------------------------------------------------------:|:--------------------------------------------------------------------------------------------------------:|:--------------------------------------------------------------------------------------------------------:|:---------------------------------------------------------------------------------------------------------:|:---------------------------------------------------------------------------------------------------------:|:---------------------------------------------------------------------------------------------------------:|:-------------------------------------------------------------------------------------------------------:|:------------------------------------------------------------------------------------------------:|:--------------------------------------------------------------------------------------------------------:|:----------------------------------------------------------------------------------------------------------:|:--------------------------------------------------------------------------------------------------------:|:-----------------:|:--------------:|:----------:|:----------------:|
+ | 0.132 | 1.9991 | 2313 | 0.4438 | {'precision': 0.8482758620689655, 'recall': 0.9318181818181818, 'f1': 0.8880866425992779, 'number': 132} | {'precision': 0.7001897533206831, 'recall': 0.75, 'f1': 0.7242394504416094, 'number': 984} | {'precision': 0.7104247104247104, 'recall': 0.6501766784452296, 'f1': 0.6789667896678967, 'number': 283} | {'precision': 0.30029154518950435, 'recall': 0.39015151515151514, 'f1': 0.3393739703459638, 'number': 264} | {'precision': 0.7647058823529411, 'recall': 0.7982456140350878, 'f1': 0.7811158798283262, 'number': 114} | {'precision': 0.5693430656934306, 'recall': 0.616600790513834, 'f1': 0.5920303605313093, 'number': 253} | {'precision': 0.7090248962655602, 'recall': 0.7474029524330235, 'f1': 0.7277082778812883, 'number': 1829} | {'precision': 0.6387607119314437, 'recall': 0.6891891891891891, 'f1': 0.6630174478275744, 'number': 1406} | {'precision': 0.6537414965986394, 'recall': 0.7198501872659177, 'f1': 0.685204991087344, 'number': 1335} | {'precision': 0.7888888888888889, 'recall': 0.9102564102564102, 'f1': 0.8452380952380951, 'number': 78} | {'precision': 0.37606837606837606, 'recall': 0.34375, 'f1': 0.35918367346938773, 'number': 256} | {'precision': 0.6036217303822937, 'recall': 0.5514705882352942, 'f1': 0.5763688760806917, 'number': 544} | {'precision': 0.4448462929475588, 'recall': 0.47398843930635837, 'f1': 0.458955223880597, 'number': 519} | {'precision': 0.4489795918367347, 'recall': 0.6376811594202898, 'f1': 0.5269461077844311, 'number': 138} | 0.6320 | 0.6742 | 0.6524 | 0.8866 |
+ | 0.1236 | 3.9983 | 4626 | 0.4916 | {'precision': 0.8299319727891157, 'recall': 0.9242424242424242, 'f1': 0.8745519713261649, 'number': 132} | {'precision': 0.6889952153110048, 'recall': 0.7317073170731707, 'f1': 0.7097092163627403, 'number': 984} | {'precision': 0.6178343949044586, 'recall': 0.6855123674911661, 'f1': 0.6499162479061976, 'number': 283} | {'precision': 0.26216216216216215, 'recall': 0.36742424242424243, 'f1': 0.305993690851735, 'number': 264} | {'precision': 0.7711864406779662, 'recall': 0.7982456140350878, 'f1': 0.7844827586206897, 'number': 114} | {'precision': 0.5325077399380805, 'recall': 0.6798418972332015, 'f1': 0.5972222222222223, 'number': 253} | {'precision': 0.6995329527763363, 'recall': 0.7370147621651175, 'f1': 0.7177848775292864, 'number': 1829} | {'precision': 0.6458598726114649, 'recall': 0.7211948790896159, 'f1': 0.6814516129032258, 'number': 1406} | {'precision': 0.64526588845655, 'recall': 0.7453183520599251, 'f1': 0.691692735488356, 'number': 1335} | {'precision': 0.9156626506024096, 'recall': 0.9743589743589743, 'f1': 0.9440993788819876, 'number': 78} | {'precision': 0.3524904214559387, 'recall': 0.359375, 'f1': 0.3558994197292069, 'number': 256} | {'precision': 0.5358851674641149, 'recall': 0.6176470588235294, 'f1': 0.5738684884713919, 'number': 544} | {'precision': 0.4106060606060606, 'recall': 0.5221579961464354, 'f1': 0.45971162001696353, 'number': 519} | {'precision': 0.5416666666666666, 'recall': 0.6594202898550725, 'f1': 0.5947712418300652, 'number': 138} | 0.6138 | 0.6907 | 0.6500 | 0.8800 |
+ | 0.0909 | 5.9974 | 6939 | 0.5451 | {'precision': 0.8413793103448276, 'recall': 0.9242424242424242, 'f1': 0.8808664259927798, 'number': 132} | {'precision': 0.7414684591520165, 'recall': 0.7286585365853658, 'f1': 0.735007688364941, 'number': 984} | {'precision': 0.7054263565891473, 'recall': 0.6431095406360424, 'f1': 0.6728280961182995, 'number': 283} | {'precision': 0.33613445378151263, 'recall': 0.30303030303030304, 'f1': 0.3187250996015936, 'number': 264} | {'precision': 0.7627118644067796, 'recall': 0.7894736842105263, 'f1': 0.7758620689655172, 'number': 114} | {'precision': 0.5496688741721855, 'recall': 0.6561264822134387, 'f1': 0.5981981981981982, 'number': 253} | {'precision': 0.7087024491922876, 'recall': 0.7435757244395844, 'f1': 0.7257203842049093, 'number': 1829} | {'precision': 0.6441326530612245, 'recall': 0.7183499288762447, 'f1': 0.6792199058507061, 'number': 1406} | {'precision': 0.6782246879334258, 'recall': 0.7325842696629213, 'f1': 0.7043572200216061, 'number': 1335} | {'precision': 0.8941176470588236, 'recall': 0.9743589743589743, 'f1': 0.9325153374233129, 'number': 78} | {'precision': 0.43564356435643564, 'recall': 0.34375, 'f1': 0.38427947598253276, 'number': 256} | {'precision': 0.5513866231647635, 'recall': 0.6213235294117647, 'f1': 0.5842696629213483, 'number': 544} | {'precision': 0.4785046728971963, 'recall': 0.4932562620423892, 'f1': 0.4857685009487666, 'number': 519} | {'precision': 0.50920245398773, 'recall': 0.6014492753623188, 'f1': 0.5514950166112956, 'number': 138} | 0.6483 | 0.6817 | 0.6646 | 0.8882 |
+ | 0.0531 | 7.9965 | 9252 | 0.6110 | {'precision': 0.8356164383561644, 'recall': 0.9242424242424242, 'f1': 0.8776978417266188, 'number': 132} | {'precision': 0.7186274509803922, 'recall': 0.7449186991869918, 'f1': 0.7315369261477046, 'number': 984} | {'precision': 0.6541095890410958, 'recall': 0.6749116607773852, 'f1': 0.6643478260869565, 'number': 283} | {'precision': 0.30662020905923343, 'recall': 0.3333333333333333, 'f1': 0.3194192377495463, 'number': 264} | {'precision': 0.71875, 'recall': 0.8070175438596491, 'f1': 0.7603305785123967, 'number': 114} | {'precision': 0.5838926174496645, 'recall': 0.6877470355731226, 'f1': 0.6315789473684211, 'number': 253} | {'precision': 0.7138348237769595, 'recall': 0.7419354838709677, 'f1': 0.7276139410187666, 'number': 1829} | {'precision': 0.6483375959079284, 'recall': 0.7211948790896159, 'f1': 0.6828282828282827, 'number': 1406} | {'precision': 0.6911250873515025, 'recall': 0.7408239700374532, 'f1': 0.7151120751988431, 'number': 1335} | {'precision': 0.8837209302325582, 'recall': 0.9743589743589743, 'f1': 0.9268292682926831, 'number': 78} | {'precision': 0.45045045045045046, 'recall': 0.390625, 'f1': 0.4184100418410041, 'number': 256} | {'precision': 0.5617792421746294, 'recall': 0.6268382352941176, 'f1': 0.5925282363162467, 'number': 544} | {'precision': 0.4652777777777778, 'recall': 0.5163776493256262, 'f1': 0.48949771689497723, 'number': 519} | {'precision': 0.49444444444444446, 'recall': 0.644927536231884, 'f1': 0.559748427672956, 'number': 138} | 0.6448 | 0.6926 | 0.6678 | 0.8877 |
+ | 0.0441 | 9.9957 | 11565 | 0.6412 | {'precision': 0.8231292517006803, 'recall': 0.9166666666666666, 'f1': 0.8673835125448027, 'number': 132} | {'precision': 0.7318548387096774, 'recall': 0.7378048780487805, 'f1': 0.7348178137651821, 'number': 984} | {'precision': 0.677536231884058, 'recall': 0.6607773851590106, 'f1': 0.669051878354204, 'number': 283} | {'precision': 0.30604982206405695, 'recall': 0.32575757575757575, 'f1': 0.3155963302752293, 'number': 264} | {'precision': 0.7419354838709677, 'recall': 0.8070175438596491, 'f1': 0.7731092436974789, 'number': 114} | {'precision': 0.5209003215434084, 'recall': 0.6403162055335968, 'f1': 0.5744680851063829, 'number': 253} | {'precision': 0.7192796610169492, 'recall': 0.7424822307271733, 'f1': 0.7306967984934087, 'number': 1829} | {'precision': 0.656473649967469, 'recall': 0.7176386913229018, 'f1': 0.6856948691811077, 'number': 1406} | {'precision': 0.7022955523672884, 'recall': 0.7333333333333333, 'f1': 0.717478930010993, 'number': 1335} | {'precision': 0.8837209302325582, 'recall': 0.9743589743589743, 'f1': 0.9268292682926831, 'number': 78} | {'precision': 0.4470046082949309, 'recall': 0.37890625, 'f1': 0.4101479915433404, 'number': 256} | {'precision': 0.5621890547263682, 'recall': 0.6231617647058824, 'f1': 0.5911072362685265, 'number': 544} | {'precision': 0.47593582887700536, 'recall': 0.5144508670520231, 'f1': 0.49444444444444446, 'number': 519} | {'precision': 0.5028571428571429, 'recall': 0.6376811594202898, 'f1': 0.5623003194888179, 'number': 138} | 0.6503 | 0.6868 | 0.6680 | 0.8884 |
 
 
  ### Framework versions
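
A note for readers of this commit: the changed hunks above only touch the reported metrics and the epoch count, and no usage snippet appears in this diff. The sketch below shows how a checkpoint like this is typically loaded for Vietnamese NER with the `transformers` token-classification pipeline. The repo id is taken from the card; the example sentence and the `aggregation_strategy` setting are illustrative assumptions, not part of the commit.

```python
# Minimal inference sketch (assumes a standard token-classification checkpoint on the Hub).
from transformers import pipeline

ner = pipeline(
    "token-classification",
    model="lilyyellow/my_awesome_ner-token_classification_v1.0.7-5",
    aggregation_strategy="simple",  # merge word pieces into whole entity spans
)

# Hypothetical Vietnamese sentence; the card's entity types include
# Person, Location, Datetime, Organization, Age, and so on.
text = "Ông Nguyễn Văn An sinh năm 1975 tại Hà Nội."
for entity in ner(text):
    print(entity["entity_group"], entity["word"], round(entity["score"], 3))
```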
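The per-entity lines in the card (precision, recall, f1 and a `number` support count, plus the Overall figures) follow the output format of the `seqeval` metric as exposed by the `evaluate` library. The sketch below shows how such numbers are commonly produced; this is an assumption about the tooling, not the exact evaluation script behind this commit, and the toy labels are illustrative.

```python
# Sketch: seqeval-style per-entity metrics (assumed tooling, toy IOB2 data).
import evaluate

seqeval = evaluate.load("seqeval")

# Toy references/predictions using label names in the spirit of the card.
references = [["B-PERSON", "I-PERSON", "O", "B-LOCATION"]]
predictions = [["B-PERSON", "I-PERSON", "O", "B-DATETIME"]]

results = seqeval.compute(predictions=predictions, references=references)

# Per-entity entries look like the card's lines, e.g.
# {'precision': ..., 'recall': ..., 'f1': ..., 'number': 1}
print(results["PERSON"])
print(results["overall_precision"], results["overall_recall"],
      results["overall_f1"], results["overall_accuracy"])
```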
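The hyperparameter hunk records seed 42, Adam with the default betas and epsilon, a cosine learning-rate schedule, and the epoch count moving from 5 to 10. The sketch below maps those listed values onto `transformers.TrainingArguments`; the learning rate, batch sizes, and output directory are not visible in this hunk, so they are left as placeholders or omitted rather than guessed.

```python
# Sketch only: the hyperparameters shown in the diff, expressed as TrainingArguments.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="my_awesome_ner-token_classification_v1.0.7-5",  # placeholder
    num_train_epochs=10,         # "num_epochs: 10" in the new card
    lr_scheduler_type="cosine",  # "lr_scheduler_type: cosine"
    seed=42,                     # "seed: 42"
    adam_beta1=0.9,              # Adam betas=(0.9, 0.999), the library defaults
    adam_beta2=0.999,
    adam_epsilon=1e-08,
    # learning_rate and per_device_*_batch_size are listed in the full card
    # but not in this hunk, so they are intentionally not set here.
)
```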