disk0dancer committed
Commit 2bf2c23 · verified · 1 Parent(s): c46b38d

End of training

README.md CHANGED
@@ -20,11 +20,11 @@ should probably proofread and complete it, then remove this comment. -->
 
 This model is a fine-tuned version of [ai-forever/ruBert-base](https://huggingface.co/ai-forever/ruBert-base) on an unknown dataset.
 It achieves the following results on the evaluation set:
- - Loss: 0.4334
- - Precision: 0.6105
- - Recall: 0.5297
- - F1: 0.5672
- - Accuracy: 0.6302
+ - Loss: 0.4473
+ - Precision: 0.5945
+ - Recall: 0.4954
+ - F1: 0.5405
+ - Accuracy: 0.6219
 
 ## Model description
 
@@ -55,56 +55,56 @@ The following hyperparameters were used during training:
 
 | Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1 | Accuracy |
 |:-------------:|:-----:|:----:|:---------------:|:---------:|:------:|:------:|:--------:|
- | No log | 1.0 | 2 | 3.3953 | 0.0380 | 0.0365 | 0.0373 | 0.0227 |
- | No log | 2.0 | 4 | 2.8484 | 0.0487 | 0.0251 | 0.0331 | 0.0124 |
- | No log | 3.0 | 6 | 2.3028 | 0.0 | 0.0 | 0.0 | 0.0021 |
- | No log | 4.0 | 8 | 1.8223 | 0.0 | 0.0 | 0.0 | 0.0 |
- | No log | 5.0 | 10 | 1.4954 | 0.0 | 0.0 | 0.0 | 0.0 |
- | No log | 6.0 | 12 | 1.3017 | 0.0 | 0.0 | 0.0 | 0.0 |
- | No log | 7.0 | 14 | 1.1878 | 0.0 | 0.0 | 0.0 | 0.0 |
- | No log | 8.0 | 16 | 1.1186 | 0.0 | 0.0 | 0.0 | 0.0 |
- | No log | 9.0 | 18 | 1.0686 | 0.0 | 0.0 | 0.0 | 0.0 |
- | No log | 10.0 | 20 | 1.0243 | 0.0 | 0.0 | 0.0 | 0.0 |
- | No log | 11.0 | 22 | 0.9837 | 0.0 | 0.0 | 0.0 | 0.0 |
- | No log | 12.0 | 24 | 0.9464 | 0.0 | 0.0 | 0.0 | 0.0 |
- | No log | 13.0 | 26 | 0.9117 | 0.0 | 0.0 | 0.0 | 0.0041 |
- | No log | 14.0 | 28 | 0.8793 | 0.0 | 0.0 | 0.0 | 0.0537 |
- | No log | 15.0 | 30 | 0.8493 | 0.0 | 0.0 | 0.0 | 0.1715 |
- | No log | 16.0 | 32 | 0.8223 | 0.5 | 0.0023 | 0.0045 | 0.2169 |
- | No log | 17.0 | 34 | 0.7976 | 0.5 | 0.0320 | 0.0601 | 0.2665 |
- | No log | 18.0 | 36 | 0.7738 | 0.5147 | 0.0799 | 0.1383 | 0.3140 |
- | No log | 19.0 | 38 | 0.7509 | 0.48 | 0.1096 | 0.1784 | 0.3388 |
- | No log | 20.0 | 40 | 0.7292 | 0.4599 | 0.1438 | 0.2191 | 0.3698 |
- | No log | 21.0 | 42 | 0.7088 | 0.4444 | 0.1826 | 0.2589 | 0.4029 |
- | No log | 22.0 | 44 | 0.6889 | 0.4369 | 0.2055 | 0.2795 | 0.4215 |
- | No log | 23.0 | 46 | 0.6690 | 0.4328 | 0.2352 | 0.3047 | 0.4442 |
- | No log | 24.0 | 48 | 0.6498 | 0.4444 | 0.2648 | 0.3319 | 0.4690 |
- | No log | 25.0 | 50 | 0.6306 | 0.4513 | 0.2854 | 0.3497 | 0.4835 |
- | No log | 26.0 | 52 | 0.6117 | 0.4574 | 0.2945 | 0.3583 | 0.4897 |
- | No log | 27.0 | 54 | 0.5937 | 0.4510 | 0.2945 | 0.3564 | 0.4959 |
- | No log | 28.0 | 56 | 0.5770 | 0.4558 | 0.3059 | 0.3661 | 0.5041 |
- | No log | 29.0 | 58 | 0.5616 | 0.4852 | 0.3379 | 0.3984 | 0.5269 |
- | No log | 30.0 | 60 | 0.5475 | 0.5 | 0.3562 | 0.416 | 0.5393 |
- | No log | 31.0 | 62 | 0.5347 | 0.5077 | 0.3767 | 0.4325 | 0.5517 |
- | No log | 32.0 | 64 | 0.5225 | 0.5301 | 0.4018 | 0.4571 | 0.5640 |
- | No log | 33.0 | 66 | 0.5112 | 0.5341 | 0.4110 | 0.4645 | 0.5702 |
- | No log | 34.0 | 68 | 0.5012 | 0.5449 | 0.4292 | 0.4802 | 0.5826 |
- | No log | 35.0 | 70 | 0.4924 | 0.5447 | 0.4315 | 0.4815 | 0.5826 |
- | No log | 36.0 | 72 | 0.4845 | 0.5537 | 0.4475 | 0.4949 | 0.5930 |
- | No log | 37.0 | 74 | 0.4773 | 0.5577 | 0.4521 | 0.4994 | 0.5971 |
- | No log | 38.0 | 76 | 0.4706 | 0.5635 | 0.4658 | 0.51 | 0.6033 |
- | No log | 39.0 | 78 | 0.4645 | 0.5671 | 0.4726 | 0.5156 | 0.6054 |
- | No log | 40.0 | 80 | 0.4590 | 0.5772 | 0.4863 | 0.5279 | 0.6116 |
- | No log | 41.0 | 82 | 0.4540 | 0.5822 | 0.4932 | 0.5340 | 0.6136 |
- | No log | 42.0 | 84 | 0.4496 | 0.5903 | 0.5 | 0.5414 | 0.6157 |
- | No log | 43.0 | 86 | 0.4457 | 0.5963 | 0.5091 | 0.5493 | 0.6178 |
- | No log | 44.0 | 88 | 0.4425 | 0.6021 | 0.5183 | 0.5571 | 0.6240 |
- | No log | 45.0 | 90 | 0.4398 | 0.6021 | 0.5183 | 0.5571 | 0.6240 |
- | No log | 46.0 | 92 | 0.4376 | 0.6042 | 0.5228 | 0.5606 | 0.6260 |
- | No log | 47.0 | 94 | 0.4358 | 0.6042 | 0.5228 | 0.5606 | 0.6260 |
- | No log | 48.0 | 96 | 0.4345 | 0.6079 | 0.5274 | 0.5648 | 0.6281 |
- | No log | 49.0 | 98 | 0.4337 | 0.6105 | 0.5297 | 0.5672 | 0.6302 |
- | No log | 50.0 | 100 | 0.4334 | 0.6105 | 0.5297 | 0.5672 | 0.6302 |
+ | No log | 1.0 | 2 | 3.2246 | 0.0300 | 0.0160 | 0.0209 | 0.0062 |
+ | No log | 2.0 | 4 | 2.6688 | 0.0 | 0.0 | 0.0 | 0.0021 |
+ | No log | 3.0 | 6 | 2.1227 | 0.0 | 0.0 | 0.0 | 0.0021 |
+ | No log | 4.0 | 8 | 1.6906 | 0.0 | 0.0 | 0.0 | 0.0 |
+ | No log | 5.0 | 10 | 1.4171 | 0.0 | 0.0 | 0.0 | 0.0 |
+ | No log | 6.0 | 12 | 1.2636 | 0.0 | 0.0 | 0.0 | 0.0 |
+ | No log | 7.0 | 14 | 1.1762 | 0.0 | 0.0 | 0.0 | 0.0 |
+ | No log | 8.0 | 16 | 1.1150 | 0.0 | 0.0 | 0.0 | 0.0 |
+ | No log | 9.0 | 18 | 1.0601 | 0.0 | 0.0 | 0.0 | 0.0 |
+ | No log | 10.0 | 20 | 1.0094 | 0.0 | 0.0 | 0.0 | 0.0 |
+ | No log | 11.0 | 22 | 0.9662 | 0.0 | 0.0 | 0.0 | 0.0021 |
+ | No log | 12.0 | 24 | 0.9311 | 0.0 | 0.0 | 0.0 | 0.0124 |
+ | No log | 13.0 | 26 | 0.9011 | 0.0 | 0.0 | 0.0 | 0.0847 |
+ | No log | 14.0 | 28 | 0.8737 | 0.0 | 0.0 | 0.0 | 0.1921 |
+ | No log | 15.0 | 30 | 0.8476 | 0.0 | 0.0 | 0.0 | 0.2231 |
+ | No log | 16.0 | 32 | 0.8230 | 0.0 | 0.0 | 0.0 | 0.2335 |
+ | No log | 17.0 | 34 | 0.7996 | 0.5 | 0.0160 | 0.0310 | 0.25 |
+ | No log | 18.0 | 36 | 0.7772 | 0.5 | 0.0342 | 0.0641 | 0.2665 |
+ | No log | 19.0 | 38 | 0.7550 | 0.4630 | 0.0571 | 0.1016 | 0.2913 |
+ | No log | 20.0 | 40 | 0.7323 | 0.4706 | 0.0731 | 0.1265 | 0.3079 |
+ | No log | 21.0 | 42 | 0.7100 | 0.4333 | 0.0890 | 0.1477 | 0.3244 |
+ | No log | 22.0 | 44 | 0.6888 | 0.4122 | 0.1233 | 0.1898 | 0.3595 |
+ | No log | 23.0 | 46 | 0.6686 | 0.3778 | 0.1553 | 0.2201 | 0.3967 |
+ | No log | 24.0 | 48 | 0.6490 | 0.3972 | 0.1941 | 0.2607 | 0.4236 |
+ | No log | 25.0 | 50 | 0.6304 | 0.4149 | 0.2283 | 0.2946 | 0.4483 |
+ | No log | 26.0 | 52 | 0.6130 | 0.4504 | 0.2694 | 0.3371 | 0.4773 |
+ | No log | 27.0 | 54 | 0.5967 | 0.4593 | 0.2831 | 0.3503 | 0.4855 |
+ | No log | 28.0 | 56 | 0.5815 | 0.4657 | 0.2945 | 0.3608 | 0.4938 |
+ | No log | 29.0 | 58 | 0.5675 | 0.4842 | 0.3151 | 0.3817 | 0.5041 |
+ | No log | 30.0 | 60 | 0.5545 | 0.4916 | 0.3356 | 0.3989 | 0.5165 |
+ | No log | 31.0 | 62 | 0.5423 | 0.4967 | 0.3447 | 0.4070 | 0.5269 |
+ | No log | 32.0 | 64 | 0.5311 | 0.5016 | 0.3539 | 0.4150 | 0.5372 |
+ | No log | 33.0 | 66 | 0.5209 | 0.5016 | 0.3539 | 0.4150 | 0.5372 |
+ | No log | 34.0 | 68 | 0.5118 | 0.5063 | 0.3653 | 0.4244 | 0.5455 |
+ | No log | 35.0 | 70 | 0.5035 | 0.5140 | 0.3767 | 0.4348 | 0.5537 |
+ | No log | 36.0 | 72 | 0.4960 | 0.5105 | 0.3881 | 0.4410 | 0.5599 |
+ | No log | 37.0 | 74 | 0.4891 | 0.5208 | 0.3995 | 0.4522 | 0.5682 |
+ | No log | 38.0 | 76 | 0.4827 | 0.5249 | 0.4087 | 0.4596 | 0.5723 |
+ | No log | 39.0 | 78 | 0.4770 | 0.5407 | 0.4247 | 0.4757 | 0.5806 |
+ | No log | 40.0 | 80 | 0.4719 | 0.5473 | 0.4361 | 0.4854 | 0.5888 |
+ | No log | 41.0 | 82 | 0.4673 | 0.5568 | 0.4475 | 0.4962 | 0.5971 |
+ | No log | 42.0 | 84 | 0.4632 | 0.5581 | 0.4498 | 0.4981 | 0.5992 |
+ | No log | 43.0 | 86 | 0.4597 | 0.5682 | 0.4658 | 0.5119 | 0.6074 |
+ | No log | 44.0 | 88 | 0.4565 | 0.5754 | 0.4703 | 0.5176 | 0.6136 |
+ | No log | 45.0 | 90 | 0.4538 | 0.5766 | 0.4726 | 0.5194 | 0.6136 |
+ | No log | 46.0 | 92 | 0.4515 | 0.5810 | 0.4749 | 0.5226 | 0.6157 |
+ | No log | 47.0 | 94 | 0.4497 | 0.5845 | 0.4817 | 0.5282 | 0.6178 |
+ | No log | 48.0 | 96 | 0.4484 | 0.5918 | 0.4932 | 0.5380 | 0.6198 |
+ | No log | 49.0 | 98 | 0.4477 | 0.5918 | 0.4932 | 0.5380 | 0.6198 |
+ | No log | 50.0 | 100 | 0.4473 | 0.5945 | 0.4954 | 0.5405 | 0.6219 |
 
 
 ### Framework versions
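
The card in this commit ships no usage snippet. Below is a minimal loading sketch, not part of the commit itself: it assumes the checkpoint was trained for token classification (the seqeval-style Precision/Recall/F1 columns above point that way) and uses the hypothetical repo id `disk0dancer/ruBert-base-finetuned` as a placeholder for the actual model name.

```python
# Hypothetical usage sketch -- the repo id and the token-classification task
# are assumptions, not taken from this commit.
from transformers import AutoModelForTokenClassification, AutoTokenizer, pipeline

model_id = "disk0dancer/ruBert-base-finetuned"  # placeholder; substitute the real repo id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForTokenClassification.from_pretrained(model_id)

# Group sub-word predictions back into word-level spans.
tagger = pipeline(
    "token-classification",
    model=model,
    tokenizer=tokenizer,
    aggregation_strategy="simple",
)

print(tagger("Москва — столица России."))
```

If the fine-tuning task was actually sentence-level classification rather than tagging, the same sketch applies with `AutoModelForSequenceClassification` and the `text-classification` pipeline instead.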
model.safetensors CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
- oid sha256:b8891bff841bcafe1dff30007d44645a1135cc7b7fa5d2def73162fd1da7fbaa
+ oid sha256:68d1e78a5fe838ede415e521343a41ae1544521e8c80c650a34a5d95301a8a67
 size 711062560
runs/Mar14_10-26-51_423946ca484c/events.out.tfevents.1710412034.423946ca484c.1459.4 CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
- oid sha256:cfc72e39570e246f653563d95010cacfc1dea397ab2548ad90f31a3cc3e753f1
- size 7960
+ oid sha256:62243c40fa87b5d9b7eb74313ab18e4ca9f21d30fa219ef7616757e22476a7bd
+ size 11201
runs/Mar14_10-29-50_423946ca484c/events.out.tfevents.1710412194.423946ca484c.1459.5 ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:3af1a2f8d72831b2453e28374c91ec507539038d9c7a86eae74a418179ca7f35
+ size 30529
runs/Mar14_10-29-50_423946ca484c/events.out.tfevents.1710413242.423946ca484c.1459.6 ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:1c1ead9f40cf8e1eac7eacfebc59f42f8cae3ecf6271f955101493b44ef6a589
+ size 551
training_args.bin CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
- oid sha256:0dfd2673c88b6c3d74d72be6f1b9ae6a99ec21f7bf4e30444eac75bfc8facbb7
+ oid sha256:75bb1fb4593a8f50556e93e5c07f2008ebe25cd6a1f4b2e165c4bd2389c06877
 size 4920