CTEBMSP_10e_FLERT / training.log
2023-04-11 07:52:01,240 ----------------------------------------------------------------------------------------------------
2023-04-11 07:52:01,244 Model: "SequenceTagger(
(embeddings): TransformerWordEmbeddings(
(model): RobertaModel(
(embeddings): RobertaEmbeddings(
(word_embeddings): Embedding(50263, 768)
(position_embeddings): Embedding(514, 768, padding_idx=1)
(token_type_embeddings): Embedding(1, 768)
(LayerNorm): LayerNorm((768,), eps=1e-05, elementwise_affine=True)
(dropout): Dropout(p=0.1, inplace=False)
)
(encoder): RobertaEncoder(
(layer): ModuleList(
(0-11): 12 x RobertaLayer(
(attention): RobertaAttention(
(self): RobertaSelfAttention(
(query): Linear(in_features=768, out_features=768, bias=True)
(key): Linear(in_features=768, out_features=768, bias=True)
(value): Linear(in_features=768, out_features=768, bias=True)
(dropout): Dropout(p=0.1, inplace=False)
)
(output): RobertaSelfOutput(
(dense): Linear(in_features=768, out_features=768, bias=True)
(LayerNorm): LayerNorm((768,), eps=1e-05, elementwise_affine=True)
(dropout): Dropout(p=0.1, inplace=False)
)
)
(intermediate): RobertaIntermediate(
(dense): Linear(in_features=768, out_features=3072, bias=True)
(intermediate_act_fn): GELUActivation()
)
(output): RobertaOutput(
(dense): Linear(in_features=3072, out_features=768, bias=True)
(LayerNorm): LayerNorm((768,), eps=1e-05, elementwise_affine=True)
(dropout): Dropout(p=0.1, inplace=False)
)
)
)
)
(pooler): RobertaPooler(
(dense): Linear(in_features=768, out_features=768, bias=True)
(activation): Tanh()
)
)
)
(locked_dropout): LockedDropout(p=0.5)
(linear): Linear(in_features=768, out_features=17, bias=True)
(loss_function): CrossEntropyLoss()
)"
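The module summary above prints every layer shape, so the model size can be checked by hand. The following sketch (our own arithmetic, not part of the log) totals the parameters from the printed dimensions; it lands near the ~125M parameters expected of a RoBERTa-base encoder plus the 17-tag linear head:

```python
# Back-of-the-envelope parameter count for the printed architecture.
# All shapes are taken directly from the module summary above.

H, FF, LAYERS = 768, 3072, 12            # hidden size, FFN size, encoder depth
VOCAB, MAX_POS, TYPES, TAGS = 50263, 514, 1, 17

def linear(n_in, n_out):                 # weight matrix + bias vector
    return n_in * n_out + n_out

# word/position/token-type embeddings + embedding LayerNorm (weight + bias)
embeddings = VOCAB * H + MAX_POS * H + TYPES * H + 2 * H

per_layer = (
    4 * linear(H, H)     # query, key, value, attention output dense
    + 2 * H              # attention-output LayerNorm
    + linear(H, FF)      # intermediate dense
    + linear(FF, H)      # output dense
    + 2 * H              # output LayerNorm
)

pooler = linear(H, H)
tag_head = linear(H, TAGS)               # final linear over the 17 tags

total = embeddings + LAYERS * per_layer + pooler + tag_head
print(f"{total:,}")                      # 124,657,169 (~124.7M parameters)
```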
2023-04-11 07:52:01,245 ----------------------------------------------------------------------------------------------------
2023-04-11 07:52:01,247 Corpus: "Corpus: 12554 train + 4549 dev + 4505 test sentences"
2023-04-11 07:52:01,248 ----------------------------------------------------------------------------------------------------
2023-04-11 07:52:01,250 Parameters:
2023-04-11 07:52:01,252 - learning_rate: "0.000050"
2023-04-11 07:52:01,253 - mini_batch_size: "4"
2023-04-11 07:52:01,254 - patience: "3"
2023-04-11 07:52:01,256 - anneal_factor: "0.5"
2023-04-11 07:52:01,257 - max_epochs: "10"
2023-04-11 07:52:01,258 - shuffle: "True"
2023-04-11 07:52:01,259 - train_with_dev: "True"
2023-04-11 07:52:01,260 - batch_growth_annealing: "False"
2023-04-11 07:52:01,262 ----------------------------------------------------------------------------------------------------
2023-04-11 07:52:01,264 Model training base path: "CREBMSP_results"
2023-04-11 07:52:01,265 ----------------------------------------------------------------------------------------------------
2023-04-11 07:52:01,266 Device: cuda
2023-04-11 07:52:01,267 ----------------------------------------------------------------------------------------------------
2023-04-11 07:52:01,269 Embeddings storage mode: none
2023-04-11 07:52:01,270 ----------------------------------------------------------------------------------------------------
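The logged learning rates below rise from 0.000005 to the 0.000050 peak across epoch 1, then fall linearly to zero by the end of epoch 10. This is consistent with linear warmup over the first epoch's 4276 mini-batches followed by linear decay, the usual schedule for transformer fine-tuning; a minimal sketch (the helper name and the warmup span are our inference from the log, not stated in it):

```python
# Reconstruct the LR schedule implied by the logged lr values:
# linear warmup over the first epoch (4276 steps), then linear decay
# to zero at the final step of epoch 10 (42760 steps total).
PEAK_LR = 5e-5
STEPS_PER_EPOCH = 4276
TOTAL_STEPS = 10 * STEPS_PER_EPOCH
WARMUP_STEPS = STEPS_PER_EPOCH           # warmup appears to span epoch 1

def lr_at(step: int) -> float:
    """Learning rate after `step` optimizer steps (linear warmup + decay)."""
    if step < WARMUP_STEPS:
        return PEAK_LR * step / WARMUP_STEPS
    return PEAK_LR * (TOTAL_STEPS - step) / (TOTAL_STEPS - WARMUP_STEPS)

# Matches the logged values, e.g. epoch 1 iter 427 -> ~0.000005,
# epoch 2 iter 4270 (global step 8546) -> ~0.000044.
```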
2023-04-11 07:52:31,267 epoch 1 - iter 427/4276 - loss 1.87909215 - time (sec): 30.00 - samples/sec: 1446.87 - lr: 0.000005
2023-04-11 07:52:57,233 epoch 1 - iter 854/4276 - loss 1.32536726 - time (sec): 55.96 - samples/sec: 1540.07 - lr: 0.000010
2023-04-11 07:53:22,647 epoch 1 - iter 1281/4276 - loss 1.12000789 - time (sec): 81.38 - samples/sec: 1412.04 - lr: 0.000015
2023-04-11 07:53:48,118 epoch 1 - iter 1708/4276 - loss 1.00885882 - time (sec): 106.85 - samples/sec: 1268.03 - lr: 0.000020
2023-04-11 07:54:13,232 epoch 1 - iter 2135/4276 - loss 0.90793861 - time (sec): 131.96 - samples/sec: 1192.25 - lr: 0.000025
2023-04-11 07:54:38,606 epoch 1 - iter 2562/4276 - loss 0.83160292 - time (sec): 157.33 - samples/sec: 1137.84 - lr: 0.000030
2023-04-11 07:55:03,961 epoch 1 - iter 2989/4276 - loss 0.76685321 - time (sec): 182.69 - samples/sec: 1097.97 - lr: 0.000035
2023-04-11 07:55:29,860 epoch 1 - iter 3416/4276 - loss 0.68896532 - time (sec): 208.59 - samples/sec: 1129.83 - lr: 0.000040
2023-04-11 07:55:55,140 epoch 1 - iter 3843/4276 - loss 0.64980627 - time (sec): 233.87 - samples/sec: 1106.99 - lr: 0.000045
2023-04-11 07:56:20,160 epoch 1 - iter 4270/4276 - loss 0.62203959 - time (sec): 258.89 - samples/sec: 1070.06 - lr: 0.000050
2023-04-11 07:56:20,508 ----------------------------------------------------------------------------------------------------
2023-04-11 07:56:20,510 EPOCH 1 done: loss 0.6216 - lr 0.000050
2023-04-11 07:56:22,961 ----------------------------------------------------------------------------------------------------
2023-04-11 07:56:48,475 epoch 2 - iter 427/4276 - loss 0.20826646 - time (sec): 25.51 - samples/sec: 1089.16 - lr: 0.000049
2023-04-11 07:57:14,086 epoch 2 - iter 854/4276 - loss 0.19309402 - time (sec): 51.12 - samples/sec: 1086.38 - lr: 0.000049
2023-04-11 07:57:39,771 epoch 2 - iter 1281/4276 - loss 0.19314959 - time (sec): 76.81 - samples/sec: 1082.26 - lr: 0.000048
2023-04-11 07:58:05,813 epoch 2 - iter 1708/4276 - loss 0.18982202 - time (sec): 102.85 - samples/sec: 1076.96 - lr: 0.000048
2023-04-11 07:58:31,469 epoch 2 - iter 2135/4276 - loss 0.18835936 - time (sec): 128.51 - samples/sec: 1075.89 - lr: 0.000047
2023-04-11 07:58:57,254 epoch 2 - iter 2562/4276 - loss 0.18721166 - time (sec): 154.29 - samples/sec: 1077.05 - lr: 0.000047
2023-04-11 07:59:22,930 epoch 2 - iter 2989/4276 - loss 0.18831955 - time (sec): 179.97 - samples/sec: 1077.28 - lr: 0.000046
2023-04-11 07:59:48,986 epoch 2 - iter 3416/4276 - loss 0.18784028 - time (sec): 206.02 - samples/sec: 1073.12 - lr: 0.000046
2023-04-11 08:00:14,438 epoch 2 - iter 3843/4276 - loss 0.18631720 - time (sec): 231.48 - samples/sec: 1075.27 - lr: 0.000045
2023-04-11 08:00:40,029 epoch 2 - iter 4270/4276 - loss 0.18545112 - time (sec): 257.07 - samples/sec: 1077.05 - lr: 0.000044
2023-04-11 08:00:40,402 ----------------------------------------------------------------------------------------------------
2023-04-11 08:00:40,404 EPOCH 2 done: loss 0.1853 - lr 0.000044
2023-04-11 08:00:43,081 ----------------------------------------------------------------------------------------------------
2023-04-11 08:01:08,689 epoch 3 - iter 427/4276 - loss 0.10756568 - time (sec): 25.61 - samples/sec: 1077.73 - lr: 0.000044
2023-04-11 08:01:34,223 epoch 3 - iter 854/4276 - loss 0.11256584 - time (sec): 51.14 - samples/sec: 1067.61 - lr: 0.000043
2023-04-11 08:01:59,709 epoch 3 - iter 1281/4276 - loss 0.11766577 - time (sec): 76.63 - samples/sec: 1063.30 - lr: 0.000043
2023-04-11 08:02:25,508 epoch 3 - iter 1708/4276 - loss 0.11967896 - time (sec): 102.42 - samples/sec: 1069.08 - lr: 0.000042
2023-04-11 08:02:51,126 epoch 3 - iter 2135/4276 - loss 0.12272097 - time (sec): 128.04 - samples/sec: 1068.42 - lr: 0.000042
2023-04-11 08:03:16,785 epoch 3 - iter 2562/4276 - loss 0.12613423 - time (sec): 153.70 - samples/sec: 1070.88 - lr: 0.000041
2023-04-11 08:03:42,674 epoch 3 - iter 2989/4276 - loss 0.12434777 - time (sec): 179.59 - samples/sec: 1073.70 - lr: 0.000041
2023-04-11 08:04:08,548 epoch 3 - iter 3416/4276 - loss 0.12561538 - time (sec): 205.46 - samples/sec: 1076.38 - lr: 0.000040
2023-04-11 08:04:34,388 epoch 3 - iter 3843/4276 - loss 0.12639782 - time (sec): 231.31 - samples/sec: 1077.42 - lr: 0.000039
2023-04-11 08:05:00,280 epoch 3 - iter 4270/4276 - loss 0.12565441 - time (sec): 257.20 - samples/sec: 1077.04 - lr: 0.000039
2023-04-11 08:05:00,628 ----------------------------------------------------------------------------------------------------
2023-04-11 08:05:00,630 EPOCH 3 done: loss 0.1255 - lr 0.000039
2023-04-11 08:05:03,316 ----------------------------------------------------------------------------------------------------
2023-04-11 08:05:29,064 epoch 4 - iter 427/4276 - loss 0.07937009 - time (sec): 25.75 - samples/sec: 1093.59 - lr: 0.000038
2023-04-11 08:05:55,266 epoch 4 - iter 854/4276 - loss 0.08553328 - time (sec): 51.95 - samples/sec: 1096.96 - lr: 0.000038
2023-04-11 08:06:21,370 epoch 4 - iter 1281/4276 - loss 0.08226230 - time (sec): 78.05 - samples/sec: 1077.95 - lr: 0.000037
2023-04-11 08:06:47,652 epoch 4 - iter 1708/4276 - loss 0.08759891 - time (sec): 104.33 - samples/sec: 1073.69 - lr: 0.000037
2023-04-11 08:07:13,692 epoch 4 - iter 2135/4276 - loss 0.08892818 - time (sec): 130.37 - samples/sec: 1075.03 - lr: 0.000036
2023-04-11 08:07:39,673 epoch 4 - iter 2562/4276 - loss 0.09054387 - time (sec): 156.36 - samples/sec: 1070.47 - lr: 0.000036
2023-04-11 08:08:05,603 epoch 4 - iter 2989/4276 - loss 0.09010262 - time (sec): 182.29 - samples/sec: 1068.84 - lr: 0.000035
2023-04-11 08:08:31,466 epoch 4 - iter 3416/4276 - loss 0.09103521 - time (sec): 208.15 - samples/sec: 1064.43 - lr: 0.000034
2023-04-11 08:08:57,317 epoch 4 - iter 3843/4276 - loss 0.09209534 - time (sec): 234.00 - samples/sec: 1065.67 - lr: 0.000034
2023-04-11 08:09:23,268 epoch 4 - iter 4270/4276 - loss 0.09259541 - time (sec): 259.95 - samples/sec: 1065.57 - lr: 0.000033
2023-04-11 08:09:23,618 ----------------------------------------------------------------------------------------------------
2023-04-11 08:09:23,619 EPOCH 4 done: loss 0.0926 - lr 0.000033
2023-04-11 08:09:26,348 ----------------------------------------------------------------------------------------------------
2023-04-11 08:09:52,083 epoch 5 - iter 427/4276 - loss 0.05592755 - time (sec): 25.73 - samples/sec: 1089.14 - lr: 0.000033
2023-04-11 08:10:17,950 epoch 5 - iter 854/4276 - loss 0.06527284 - time (sec): 51.60 - samples/sec: 1056.57 - lr: 0.000032
2023-04-11 08:10:43,825 epoch 5 - iter 1281/4276 - loss 0.06153976 - time (sec): 77.47 - samples/sec: 1056.26 - lr: 0.000032
2023-04-11 08:11:09,692 epoch 5 - iter 1708/4276 - loss 0.06749125 - time (sec): 103.34 - samples/sec: 1063.57 - lr: 0.000031
2023-04-11 08:11:35,614 epoch 5 - iter 2135/4276 - loss 0.06839364 - time (sec): 129.26 - samples/sec: 1068.27 - lr: 0.000031
2023-04-11 08:12:01,303 epoch 5 - iter 2562/4276 - loss 0.06963346 - time (sec): 154.95 - samples/sec: 1066.16 - lr: 0.000030
2023-04-11 08:12:27,328 epoch 5 - iter 2989/4276 - loss 0.06933764 - time (sec): 180.98 - samples/sec: 1070.11 - lr: 0.000029
2023-04-11 08:12:53,272 epoch 5 - iter 3416/4276 - loss 0.06831147 - time (sec): 206.92 - samples/sec: 1068.24 - lr: 0.000029
2023-04-11 08:13:19,128 epoch 5 - iter 3843/4276 - loss 0.06885265 - time (sec): 232.78 - samples/sec: 1069.77 - lr: 0.000028
2023-04-11 08:13:44,881 epoch 5 - iter 4270/4276 - loss 0.06861645 - time (sec): 258.53 - samples/sec: 1071.43 - lr: 0.000028
2023-04-11 08:13:45,250 ----------------------------------------------------------------------------------------------------
2023-04-11 08:13:45,251 EPOCH 5 done: loss 0.0687 - lr 0.000028
2023-04-11 08:13:47,855 ----------------------------------------------------------------------------------------------------
2023-04-11 08:14:13,715 epoch 6 - iter 427/4276 - loss 0.04965217 - time (sec): 25.86 - samples/sec: 1047.26 - lr: 0.000027
2023-04-11 08:14:39,500 epoch 6 - iter 854/4276 - loss 0.05200554 - time (sec): 51.64 - samples/sec: 1043.00 - lr: 0.000027
2023-04-11 08:15:05,494 epoch 6 - iter 1281/4276 - loss 0.04883649 - time (sec): 77.63 - samples/sec: 1053.18 - lr: 0.000026
2023-04-11 08:15:31,675 epoch 6 - iter 1708/4276 - loss 0.04860057 - time (sec): 103.82 - samples/sec: 1062.16 - lr: 0.000026
2023-04-11 08:15:57,397 epoch 6 - iter 2135/4276 - loss 0.04686293 - time (sec): 129.54 - samples/sec: 1064.28 - lr: 0.000025
2023-04-11 08:16:23,066 epoch 6 - iter 2562/4276 - loss 0.04688968 - time (sec): 155.21 - samples/sec: 1075.47 - lr: 0.000024
2023-04-11 08:16:48,784 epoch 6 - iter 2989/4276 - loss 0.04738732 - time (sec): 180.92 - samples/sec: 1076.18 - lr: 0.000024
2023-04-11 08:17:14,472 epoch 6 - iter 3416/4276 - loss 0.04857132 - time (sec): 206.61 - samples/sec: 1078.35 - lr: 0.000023
2023-04-11 08:17:40,237 epoch 6 - iter 3843/4276 - loss 0.04764392 - time (sec): 232.38 - samples/sec: 1078.25 - lr: 0.000023
2023-04-11 08:18:05,989 epoch 6 - iter 4270/4276 - loss 0.04784009 - time (sec): 258.13 - samples/sec: 1072.85 - lr: 0.000022
2023-04-11 08:18:06,327 ----------------------------------------------------------------------------------------------------
2023-04-11 08:18:06,329 EPOCH 6 done: loss 0.0478 - lr 0.000022
2023-04-11 08:18:08,965 ----------------------------------------------------------------------------------------------------
2023-04-11 08:18:34,621 epoch 7 - iter 427/4276 - loss 0.04169676 - time (sec): 25.65 - samples/sec: 1078.45 - lr: 0.000022
2023-04-11 08:19:00,288 epoch 7 - iter 854/4276 - loss 0.03889063 - time (sec): 51.32 - samples/sec: 1079.22 - lr: 0.000021
2023-04-11 08:19:25,845 epoch 7 - iter 1281/4276 - loss 0.03600230 - time (sec): 76.88 - samples/sec: 1074.59 - lr: 0.000021
2023-04-11 08:19:51,633 epoch 7 - iter 1708/4276 - loss 0.03408375 - time (sec): 102.67 - samples/sec: 1069.48 - lr: 0.000020
2023-04-11 08:20:17,371 epoch 7 - iter 2135/4276 - loss 0.03496732 - time (sec): 128.40 - samples/sec: 1071.00 - lr: 0.000019
2023-04-11 08:20:43,117 epoch 7 - iter 2562/4276 - loss 0.03456081 - time (sec): 154.15 - samples/sec: 1076.58 - lr: 0.000019
2023-04-11 08:21:08,941 epoch 7 - iter 2989/4276 - loss 0.03472130 - time (sec): 179.97 - samples/sec: 1080.88 - lr: 0.000018
2023-04-11 08:21:34,633 epoch 7 - iter 3416/4276 - loss 0.03388419 - time (sec): 205.67 - samples/sec: 1082.92 - lr: 0.000018
2023-04-11 08:22:00,268 epoch 7 - iter 3843/4276 - loss 0.03321656 - time (sec): 231.30 - samples/sec: 1079.07 - lr: 0.000017
2023-04-11 08:22:26,001 epoch 7 - iter 4270/4276 - loss 0.03294924 - time (sec): 257.03 - samples/sec: 1077.38 - lr: 0.000017
2023-04-11 08:22:26,358 ----------------------------------------------------------------------------------------------------
2023-04-11 08:22:26,359 EPOCH 7 done: loss 0.0329 - lr 0.000017
2023-04-11 08:22:28,991 ----------------------------------------------------------------------------------------------------
2023-04-11 08:22:54,759 epoch 8 - iter 427/4276 - loss 0.01991391 - time (sec): 25.77 - samples/sec: 1091.09 - lr: 0.000016
2023-04-11 08:23:20,455 epoch 8 - iter 854/4276 - loss 0.02008748 - time (sec): 51.46 - samples/sec: 1087.44 - lr: 0.000016
2023-04-11 08:23:46,301 epoch 8 - iter 1281/4276 - loss 0.02071964 - time (sec): 77.31 - samples/sec: 1091.13 - lr: 0.000015
2023-04-11 08:24:12,005 epoch 8 - iter 1708/4276 - loss 0.02060885 - time (sec): 103.01 - samples/sec: 1086.76 - lr: 0.000014
2023-04-11 08:24:37,602 epoch 8 - iter 2135/4276 - loss 0.02230171 - time (sec): 128.61 - samples/sec: 1081.19 - lr: 0.000014
2023-04-11 08:25:03,104 epoch 8 - iter 2562/4276 - loss 0.02194943 - time (sec): 154.11 - samples/sec: 1081.02 - lr: 0.000013
2023-04-11 08:25:28,792 epoch 8 - iter 2989/4276 - loss 0.02166994 - time (sec): 179.80 - samples/sec: 1081.85 - lr: 0.000013
2023-04-11 08:25:54,314 epoch 8 - iter 3416/4276 - loss 0.02079076 - time (sec): 205.32 - samples/sec: 1078.58 - lr: 0.000012
2023-04-11 08:26:19,932 epoch 8 - iter 3843/4276 - loss 0.02085187 - time (sec): 230.94 - samples/sec: 1077.62 - lr: 0.000012
2023-04-11 08:26:45,880 epoch 8 - iter 4270/4276 - loss 0.02104430 - time (sec): 256.89 - samples/sec: 1077.94 - lr: 0.000011
2023-04-11 08:26:46,229 ----------------------------------------------------------------------------------------------------
2023-04-11 08:26:46,231 EPOCH 8 done: loss 0.0210 - lr 0.000011
2023-04-11 08:26:48,847 ----------------------------------------------------------------------------------------------------
2023-04-11 08:27:14,788 epoch 9 - iter 427/4276 - loss 0.01704588 - time (sec): 25.94 - samples/sec: 1092.91 - lr: 0.000011
2023-04-11 08:27:40,564 epoch 9 - iter 854/4276 - loss 0.01373665 - time (sec): 51.72 - samples/sec: 1083.59 - lr: 0.000010
2023-04-11 08:28:06,247 epoch 9 - iter 1281/4276 - loss 0.01269875 - time (sec): 77.40 - samples/sec: 1099.73 - lr: 0.000009
2023-04-11 08:28:31,749 epoch 9 - iter 1708/4276 - loss 0.01307406 - time (sec): 102.90 - samples/sec: 1092.49 - lr: 0.000009
2023-04-11 08:28:57,340 epoch 9 - iter 2135/4276 - loss 0.01330464 - time (sec): 128.49 - samples/sec: 1083.63 - lr: 0.000008
2023-04-11 08:29:23,005 epoch 9 - iter 2562/4276 - loss 0.01323370 - time (sec): 154.16 - samples/sec: 1084.86 - lr: 0.000008
2023-04-11 08:29:48,714 epoch 9 - iter 2989/4276 - loss 0.01356354 - time (sec): 179.87 - samples/sec: 1081.40 - lr: 0.000007
2023-04-11 08:30:14,522 epoch 9 - iter 3416/4276 - loss 0.01333538 - time (sec): 205.67 - samples/sec: 1080.12 - lr: 0.000007
2023-04-11 08:30:40,139 epoch 9 - iter 3843/4276 - loss 0.01382847 - time (sec): 231.29 - samples/sec: 1076.78 - lr: 0.000006
2023-04-11 08:31:05,963 epoch 9 - iter 4270/4276 - loss 0.01417043 - time (sec): 257.11 - samples/sec: 1077.64 - lr: 0.000006
2023-04-11 08:31:06,310 ----------------------------------------------------------------------------------------------------
2023-04-11 08:31:06,312 EPOCH 9 done: loss 0.0142 - lr 0.000006
2023-04-11 08:31:08,911 ----------------------------------------------------------------------------------------------------
2023-04-11 08:31:34,627 epoch 10 - iter 427/4276 - loss 0.00788266 - time (sec): 25.71 - samples/sec: 1100.38 - lr: 0.000005
2023-04-11 08:32:00,278 epoch 10 - iter 854/4276 - loss 0.00916004 - time (sec): 51.37 - samples/sec: 1082.68 - lr: 0.000004
2023-04-11 08:32:25,952 epoch 10 - iter 1281/4276 - loss 0.00947741 - time (sec): 77.04 - samples/sec: 1084.11 - lr: 0.000004
2023-04-11 08:32:51,619 epoch 10 - iter 1708/4276 - loss 0.00922028 - time (sec): 102.71 - samples/sec: 1082.23 - lr: 0.000003
2023-04-11 08:33:17,397 epoch 10 - iter 2135/4276 - loss 0.00924503 - time (sec): 128.48 - samples/sec: 1087.21 - lr: 0.000003
2023-04-11 08:33:43,209 epoch 10 - iter 2562/4276 - loss 0.00928543 - time (sec): 154.30 - samples/sec: 1085.54 - lr: 0.000002
2023-04-11 08:34:09,247 epoch 10 - iter 2989/4276 - loss 0.00893538 - time (sec): 180.33 - samples/sec: 1082.30 - lr: 0.000002
2023-04-11 08:34:35,096 epoch 10 - iter 3416/4276 - loss 0.00939691 - time (sec): 206.18 - samples/sec: 1079.56 - lr: 0.000001
2023-04-11 08:35:01,291 epoch 10 - iter 3843/4276 - loss 0.00881917 - time (sec): 232.38 - samples/sec: 1073.84 - lr: 0.000001
2023-04-11 08:35:27,885 epoch 10 - iter 4270/4276 - loss 0.00882288 - time (sec): 258.97 - samples/sec: 1069.59 - lr: 0.000000
2023-04-11 08:35:28,233 ----------------------------------------------------------------------------------------------------
2023-04-11 08:35:28,234 EPOCH 10 done: loss 0.0088 - lr 0.000000
2023-04-11 08:35:36,527 ----------------------------------------------------------------------------------------------------
2023-04-11 08:35:36,530 Testing using last state of model ...
2023-04-11 08:36:06,557 Evaluating as a multi-label problem: False
2023-04-11 08:36:06,627 micro precision 0.877 - recall 0.884 - F-score 0.8805 - accuracy 0.7929
2023-04-11 08:36:06,629
Results:
- F-score (micro) 0.8805
- F-score (macro) 0.8612
- Accuracy 0.7929
By class:
              precision    recall  f1-score   support
        PROC     0.8581    0.8811    0.8695      3364
        DISO     0.8911    0.8908    0.8910      2472
        CHEM     0.9091    0.9073    0.9082      1565
        ANAT     0.8082    0.7468    0.7763       316
   micro avg     0.8770    0.8840    0.8805      7717
   macro avg     0.8666    0.8565    0.8612      7717
weighted avg     0.8770    0.8840    0.8804      7717
2023-04-11 08:36:06,629 ----------------------------------------------------------------------------------------------------
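The two averages in the results above weight classes differently: micro-F1 pools true/false positives across all 7717 test entities, while macro-F1 is the unweighted mean of the four per-class F1 scores, which is why the rare ANAT class (support 316, F1 0.7763) pulls macro below micro. A quick sanity check against the reported numbers:

```python
# Verify the reported F-scores from the per-class evaluation table.
def f1(precision: float, recall: float) -> float:
    """Harmonic mean of precision and recall."""
    return 2 * precision * recall / (precision + recall)

# micro avg: precision 0.8770, recall 0.8840 -> F1 ~0.8805 as reported
micro_f1 = f1(0.8770, 0.8840)

# macro avg: unweighted mean of the four class F1 scores -> ~0.8612
class_f1 = [0.8695, 0.8910, 0.9082, 0.7763]   # PROC, DISO, CHEM, ANAT
macro_f1 = sum(class_f1) / len(class_f1)
```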