2024-03-26 11:12:07,458 ----------------------------------------------------------------------------------------------------
2024-03-26 11:12:07,459 Model: "SequenceTagger(
  (embeddings): TransformerWordEmbeddings(
    (model): BertModel(
      (embeddings): BertEmbeddings(
        (word_embeddings): Embedding(30001, 768)
        (position_embeddings): Embedding(512, 768)
        (token_type_embeddings): Embedding(2, 768)
        (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
        (dropout): Dropout(p=0.1, inplace=False)
      )
      (encoder): BertEncoder(
        (layer): ModuleList(
          (0-11): 12 x BertLayer(
            (attention): BertAttention(
              (self): BertSelfAttention(
                (query): Linear(in_features=768, out_features=768, bias=True)
                (key): Linear(in_features=768, out_features=768, bias=True)
                (value): Linear(in_features=768, out_features=768, bias=True)
                (dropout): Dropout(p=0.1, inplace=False)
              )
              (output): BertSelfOutput(
                (dense): Linear(in_features=768, out_features=768, bias=True)
                (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
                (dropout): Dropout(p=0.1, inplace=False)
              )
            )
            (intermediate): BertIntermediate(
              (dense): Linear(in_features=768, out_features=3072, bias=True)
              (intermediate_act_fn): GELUActivation()
            )
            (output): BertOutput(
              (dense): Linear(in_features=3072, out_features=768, bias=True)
              (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
              (dropout): Dropout(p=0.1, inplace=False)
            )
          )
        )
      )
      (pooler): BertPooler(
        (dense): Linear(in_features=768, out_features=768, bias=True)
        (activation): Tanh()
      )
    )
  )
  (locked_dropout): LockedDropout(p=0.5)
  (linear): Linear(in_features=768, out_features=17, bias=True)
  (loss_function): CrossEntropyLoss()
)"
2024-03-26 11:12:07,459 ----------------------------------------------------------------------------------------------------
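Note: the module printout above describes a Flair SequenceTagger with fine-tuned transformer word embeddings and no RNN or CRF (locked dropout, then a 768-to-17 linear layer trained with cross-entropy). A minimal sketch of how such a tagger could be assembled is given below; the Hugging Face checkpoint name, data folder and column format are assumptions, since the log does not record the training script.

# Sketch only: rebuilds the architecture printed above. The checkpoint name,
# data paths and column layout are hypothetical, not taken from this log.
from flair.datasets import ColumnCorpus
from flair.embeddings import TransformerWordEmbeddings
from flair.models import SequenceTagger

# CoNLL-style corpus with token and NER columns (hypothetical location).
corpus = ColumnCorpus("data/co-funer", column_format={0: "text", 1: "ner"})
label_dict = corpus.make_label_dictionary(label_type="ner")  # 17 BIOES tags in this run

# Fine-tunable German BERT word embeddings (exact checkpoint is an assumption).
embeddings = TransformerWordEmbeddings(
    model="bert-base-german-cased",
    layers="-1",
    subtoken_pooling="first",
    fine_tune=True,
)

# No RNN, no CRF: embeddings -> LockedDropout(0.5) -> Linear(768, 17) -> CrossEntropyLoss,
# which matches the printed module structure.
tagger = SequenceTagger(
    embeddings=embeddings,
    tag_dictionary=label_dict,
    tag_type="ner",
    use_rnn=False,
    use_crf=False,
    reproject_embeddings=False,
)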
2024-03-26 11:12:07,459 Corpus: 758 train + 94 dev + 96 test sentences
2024-03-26 11:12:07,459 ----------------------------------------------------------------------------------------------------
2024-03-26 11:12:07,459 Train: 758 sentences
2024-03-26 11:12:07,459 (train_with_dev=False, train_with_test=False)
2024-03-26 11:12:07,459 ----------------------------------------------------------------------------------------------------
2024-03-26 11:12:07,459 Training Params:
2024-03-26 11:12:07,459 - learning_rate: "3e-05"
2024-03-26 11:12:07,459 - mini_batch_size: "16"
2024-03-26 11:12:07,459 - max_epochs: "10"
2024-03-26 11:12:07,459 - shuffle: "True"
2024-03-26 11:12:07,459 ----------------------------------------------------------------------------------------------------
2024-03-26 11:12:07,459 Plugins:
2024-03-26 11:12:07,459 - TensorboardLogger
2024-03-26 11:12:07,459 - LinearScheduler | warmup_fraction: '0.1'
2024-03-26 11:12:07,459 ----------------------------------------------------------------------------------------------------
2024-03-26 11:12:07,459 Final evaluation on model from best epoch (best-model.pt)
2024-03-26 11:12:07,459 - metric: "('micro avg', 'f1-score')"
2024-03-26 11:12:07,459 ----------------------------------------------------------------------------------------------------
2024-03-26 11:12:07,459 Computation:
2024-03-26 11:12:07,459 - compute on device: cuda:0
2024-03-26 11:12:07,459 - embedding storage: none
2024-03-26 11:12:07,459 ----------------------------------------------------------------------------------------------------
2024-03-26 11:12:07,459 Model training base path: "flair-co-funer-german_bert_base-bs16-e10-lr3e-05-2"
2024-03-26 11:12:07,459 ----------------------------------------------------------------------------------------------------
2024-03-26 11:12:07,459 ----------------------------------------------------------------------------------------------------
2024-03-26 11:12:07,459 Logging anything other than scalars to TensorBoard is currently not supported.
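Note: given the parameters logged above (learning rate 3e-05, mini-batch size 16, 10 epochs, linear schedule with warmup fraction 0.1, best-model selection by micro F1 on dev), a training call along the following lines would reproduce this setup. This is a sketch under those assumptions, not the original script.

# Sketch of a matching training call; continues from the tagger/corpus sketch above.
from flair.trainers import ModelTrainer

trainer = ModelTrainer(tagger, corpus)

# fine_tune() uses AdamW with a linear learning-rate schedule; its default
# warmup_fraction of 0.1 matches the LinearScheduler plugin reported above.
trainer.fine_tune(
    "flair-co-funer-german_bert_base-bs16-e10-lr3e-05-2",  # base path from the log
    learning_rate=3e-05,
    mini_batch_size=16,
    max_epochs=10,
    shuffle=True,
    main_evaluation_metric=("micro avg", "f1-score"),
    use_final_model_for_eval=False,  # evaluate best-model.pt, as this log does
)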
2024-03-26 11:12:09,254 epoch 1 - iter 4/48 - loss 3.16306784 - time (sec): 1.79 - samples/sec: 1683.21 - lr: 0.000002 - momentum: 0.000000
2024-03-26 11:12:11,484 epoch 1 - iter 8/48 - loss 3.08728552 - time (sec): 4.02 - samples/sec: 1542.55 - lr: 0.000004 - momentum: 0.000000
2024-03-26 11:12:13,424 epoch 1 - iter 12/48 - loss 3.02503918 - time (sec): 5.96 - samples/sec: 1494.59 - lr: 0.000007 - momentum: 0.000000
2024-03-26 11:12:15,448 epoch 1 - iter 16/48 - loss 2.90887267 - time (sec): 7.99 - samples/sec: 1518.99 - lr: 0.000009 - momentum: 0.000000
2024-03-26 11:12:17,757 epoch 1 - iter 20/48 - loss 2.78404911 - time (sec): 10.30 - samples/sec: 1483.97 - lr: 0.000012 - momentum: 0.000000
2024-03-26 11:12:20,883 epoch 1 - iter 24/48 - loss 2.66287611 - time (sec): 13.42 - samples/sec: 1354.23 - lr: 0.000014 - momentum: 0.000000
2024-03-26 11:12:23,406 epoch 1 - iter 28/48 - loss 2.53932089 - time (sec): 15.95 - samples/sec: 1336.49 - lr: 0.000017 - momentum: 0.000000
2024-03-26 11:12:24,241 epoch 1 - iter 32/48 - loss 2.45998768 - time (sec): 16.78 - samples/sec: 1391.37 - lr: 0.000019 - momentum: 0.000000
2024-03-26 11:12:25,574 epoch 1 - iter 36/48 - loss 2.36289832 - time (sec): 18.11 - samples/sec: 1443.65 - lr: 0.000022 - momentum: 0.000000
2024-03-26 11:12:27,525 epoch 1 - iter 40/48 - loss 2.28678685 - time (sec): 20.07 - samples/sec: 1449.80 - lr: 0.000024 - momentum: 0.000000
2024-03-26 11:12:29,505 epoch 1 - iter 44/48 - loss 2.18927422 - time (sec): 22.05 - samples/sec: 1449.30 - lr: 0.000027 - momentum: 0.000000
2024-03-26 11:12:30,921 epoch 1 - iter 48/48 - loss 2.11069614 - time (sec): 23.46 - samples/sec: 1469.26 - lr: 0.000029 - momentum: 0.000000
2024-03-26 11:12:30,922 ----------------------------------------------------------------------------------------------------
2024-03-26 11:12:30,922 EPOCH 1 done: loss 2.1107 - lr: 0.000029
2024-03-26 11:12:31,865 DEV : loss 0.7406538128852844 - f1-score (micro avg) 0.4904
2024-03-26 11:12:31,866 saving best model
2024-03-26 11:12:32,158 ----------------------------------------------------------------------------------------------------
2024-03-26 11:12:33,483 epoch 2 - iter 4/48 - loss 1.05138817 - time (sec): 1.32 - samples/sec: 2191.17 - lr: 0.000030 - momentum: 0.000000
2024-03-26 11:12:35,353 epoch 2 - iter 8/48 - loss 0.87906687 - time (sec): 3.19 - samples/sec: 1909.15 - lr: 0.000030 - momentum: 0.000000
2024-03-26 11:12:38,859 epoch 2 - iter 12/48 - loss 0.75980156 - time (sec): 6.70 - samples/sec: 1519.05 - lr: 0.000029 - momentum: 0.000000
2024-03-26 11:12:41,406 epoch 2 - iter 16/48 - loss 0.70656249 - time (sec): 9.25 - samples/sec: 1440.33 - lr: 0.000029 - momentum: 0.000000
2024-03-26 11:12:44,203 epoch 2 - iter 20/48 - loss 0.65983774 - time (sec): 12.04 - samples/sec: 1379.31 - lr: 0.000029 - momentum: 0.000000
2024-03-26 11:12:46,198 epoch 2 - iter 24/48 - loss 0.62246019 - time (sec): 14.04 - samples/sec: 1373.20 - lr: 0.000028 - momentum: 0.000000
2024-03-26 11:12:48,002 epoch 2 - iter 28/48 - loss 0.61743183 - time (sec): 15.84 - samples/sec: 1384.37 - lr: 0.000028 - momentum: 0.000000
2024-03-26 11:12:49,808 epoch 2 - iter 32/48 - loss 0.60027304 - time (sec): 17.65 - samples/sec: 1394.07 - lr: 0.000028 - momentum: 0.000000
2024-03-26 11:12:51,737 epoch 2 - iter 36/48 - loss 0.58144132 - time (sec): 19.58 - samples/sec: 1401.19 - lr: 0.000028 - momentum: 0.000000
2024-03-26 11:12:52,763 epoch 2 - iter 40/48 - loss 0.56884272 - time (sec): 20.60 - samples/sec: 1448.84 - lr: 0.000027 - momentum: 0.000000
2024-03-26 11:12:54,236 epoch 2 - iter 44/48 - loss 0.56205915 - time (sec): 22.08 - samples/sec: 1468.31 - lr: 0.000027 - momentum: 0.000000
2024-03-26 11:12:55,816 epoch 2 - iter 48/48 - loss 0.54402847 - time (sec): 23.66 - samples/sec: 1457.17 - lr: 0.000027 - momentum: 0.000000
2024-03-26 11:12:55,816 ----------------------------------------------------------------------------------------------------
2024-03-26 11:12:55,816 EPOCH 2 done: loss 0.5440 - lr: 0.000027
2024-03-26 11:12:56,749 DEV : loss 0.3272944986820221 - f1-score (micro avg) 0.7565
2024-03-26 11:12:56,750 saving best model
2024-03-26 11:12:57,213 ----------------------------------------------------------------------------------------------------
2024-03-26 11:12:59,771 epoch 3 - iter 4/48 - loss 0.30544675 - time (sec): 2.56 - samples/sec: 1176.92 - lr: 0.000026 - momentum: 0.000000
2024-03-26 11:13:01,960 epoch 3 - iter 8/48 - loss 0.30379025 - time (sec): 4.75 - samples/sec: 1338.00 - lr: 0.000026 - momentum: 0.000000
2024-03-26 11:13:03,549 epoch 3 - iter 12/48 - loss 0.31891615 - time (sec): 6.33 - samples/sec: 1400.54 - lr: 0.000026 - momentum: 0.000000
2024-03-26 11:13:05,320 epoch 3 - iter 16/48 - loss 0.29901581 - time (sec): 8.11 - samples/sec: 1402.16 - lr: 0.000026 - momentum: 0.000000
2024-03-26 11:13:06,511 epoch 3 - iter 20/48 - loss 0.30884041 - time (sec): 9.30 - samples/sec: 1471.74 - lr: 0.000025 - momentum: 0.000000
2024-03-26 11:13:08,379 epoch 3 - iter 24/48 - loss 0.31810372 - time (sec): 11.16 - samples/sec: 1473.90 - lr: 0.000025 - momentum: 0.000000
2024-03-26 11:13:10,884 epoch 3 - iter 28/48 - loss 0.31428333 - time (sec): 13.67 - samples/sec: 1415.34 - lr: 0.000025 - momentum: 0.000000
2024-03-26 11:13:12,794 epoch 3 - iter 32/48 - loss 0.31267095 - time (sec): 15.58 - samples/sec: 1420.98 - lr: 0.000025 - momentum: 0.000000
2024-03-26 11:13:14,278 epoch 3 - iter 36/48 - loss 0.30305540 - time (sec): 17.06 - samples/sec: 1452.18 - lr: 0.000024 - momentum: 0.000000
2024-03-26 11:13:16,599 epoch 3 - iter 40/48 - loss 0.29326992 - time (sec): 19.38 - samples/sec: 1424.28 - lr: 0.000024 - momentum: 0.000000
2024-03-26 11:13:19,982 epoch 3 - iter 44/48 - loss 0.27254472 - time (sec): 22.77 - samples/sec: 1415.27 - lr: 0.000024 - momentum: 0.000000
2024-03-26 11:13:21,332 epoch 3 - iter 48/48 - loss 0.26917818 - time (sec): 24.12 - samples/sec: 1429.30 - lr: 0.000023 - momentum: 0.000000
2024-03-26 11:13:21,333 ----------------------------------------------------------------------------------------------------
2024-03-26 11:13:21,333 EPOCH 3 done: loss 0.2692 - lr: 0.000023
2024-03-26 11:13:22,276 DEV : loss 0.2584502100944519 - f1-score (micro avg) 0.8292
2024-03-26 11:13:22,279 saving best model
2024-03-26 11:13:22,739 ----------------------------------------------------------------------------------------------------
2024-03-26 11:13:24,351 epoch 4 - iter 4/48 - loss 0.29781570 - time (sec): 1.61 - samples/sec: 1582.51 - lr: 0.000023 - momentum: 0.000000
2024-03-26 11:13:26,710 epoch 4 - iter 8/48 - loss 0.22897059 - time (sec): 3.97 - samples/sec: 1509.70 - lr: 0.000023 - momentum: 0.000000
2024-03-26 11:13:27,969 epoch 4 - iter 12/48 - loss 0.21408042 - time (sec): 5.23 - samples/sec: 1598.39 - lr: 0.000023 - momentum: 0.000000
2024-03-26 11:13:30,257 epoch 4 - iter 16/48 - loss 0.20453825 - time (sec): 7.52 - samples/sec: 1499.82 - lr: 0.000022 - momentum: 0.000000
2024-03-26 11:13:32,884 epoch 4 - iter 20/48 - loss 0.19298299 - time (sec): 10.14 - samples/sec: 1378.36 - lr: 0.000022 - momentum: 0.000000
2024-03-26 11:13:35,007 epoch 4 - iter 24/48 - loss 0.20089208 - time (sec): 12.27 - samples/sec: 1372.28 - lr: 0.000022 - momentum: 0.000000
2024-03-26 11:13:37,157 epoch 4 - iter 28/48 - loss 0.19731722 - time (sec): 14.42 - samples/sec: 1379.88 - lr: 0.000022 - momentum: 0.000000
2024-03-26 11:13:39,826 epoch 4 - iter 32/48 - loss 0.19273634 - time (sec): 17.09 - samples/sec: 1349.67 - lr: 0.000021 - momentum: 0.000000
2024-03-26 11:13:42,677 epoch 4 - iter 36/48 - loss 0.18444787 - time (sec): 19.94 - samples/sec: 1341.70 - lr: 0.000021 - momentum: 0.000000
2024-03-26 11:13:44,459 epoch 4 - iter 40/48 - loss 0.17908396 - time (sec): 21.72 - samples/sec: 1339.53 - lr: 0.000021 - momentum: 0.000000
2024-03-26 11:13:46,546 epoch 4 - iter 44/48 - loss 0.17774704 - time (sec): 23.81 - samples/sec: 1340.95 - lr: 0.000020 - momentum: 0.000000
2024-03-26 11:13:48,279 epoch 4 - iter 48/48 - loss 0.17546508 - time (sec): 25.54 - samples/sec: 1349.77 - lr: 0.000020 - momentum: 0.000000
2024-03-26 11:13:48,279 ----------------------------------------------------------------------------------------------------
2024-03-26 11:13:48,279 EPOCH 4 done: loss 0.1755 - lr: 0.000020
2024-03-26 11:13:49,244 DEV : loss 0.2244909107685089 - f1-score (micro avg) 0.8723
2024-03-26 11:13:49,246 saving best model
2024-03-26 11:13:49,699 ----------------------------------------------------------------------------------------------------
2024-03-26 11:13:50,536 epoch 5 - iter 4/48 - loss 0.09331887 - time (sec): 0.84 - samples/sec: 2191.66 - lr: 0.000020 - momentum: 0.000000
2024-03-26 11:13:51,957 epoch 5 - iter 8/48 - loss 0.11095802 - time (sec): 2.26 - samples/sec: 1970.19 - lr: 0.000020 - momentum: 0.000000
2024-03-26 11:13:54,821 epoch 5 - iter 12/48 - loss 0.10642550 - time (sec): 5.12 - samples/sec: 1558.10 - lr: 0.000019 - momentum: 0.000000
2024-03-26 11:13:57,934 epoch 5 - iter 16/48 - loss 0.10476513 - time (sec): 8.23 - samples/sec: 1370.39 - lr: 0.000019 - momentum: 0.000000
2024-03-26 11:13:59,365 epoch 5 - iter 20/48 - loss 0.11438732 - time (sec): 9.67 - samples/sec: 1420.32 - lr: 0.000019 - momentum: 0.000000
2024-03-26 11:14:02,022 epoch 5 - iter 24/48 - loss 0.11257857 - time (sec): 12.32 - samples/sec: 1359.65 - lr: 0.000018 - momentum: 0.000000
2024-03-26 11:14:04,162 epoch 5 - iter 28/48 - loss 0.11104460 - time (sec): 14.46 - samples/sec: 1351.01 - lr: 0.000018 - momentum: 0.000000
2024-03-26 11:14:06,530 epoch 5 - iter 32/48 - loss 0.11618625 - time (sec): 16.83 - samples/sec: 1376.21 - lr: 0.000018 - momentum: 0.000000
2024-03-26 11:14:08,037 epoch 5 - iter 36/48 - loss 0.12098463 - time (sec): 18.34 - samples/sec: 1400.80 - lr: 0.000018 - momentum: 0.000000
2024-03-26 11:14:10,603 epoch 5 - iter 40/48 - loss 0.11599874 - time (sec): 20.90 - samples/sec: 1359.09 - lr: 0.000017 - momentum: 0.000000
2024-03-26 11:14:12,737 epoch 5 - iter 44/48 - loss 0.11697148 - time (sec): 23.04 - samples/sec: 1373.23 - lr: 0.000017 - momentum: 0.000000
2024-03-26 11:14:14,708 epoch 5 - iter 48/48 - loss 0.11878602 - time (sec): 25.01 - samples/sec: 1378.39 - lr: 0.000017 - momentum: 0.000000
2024-03-26 11:14:14,708 ----------------------------------------------------------------------------------------------------
2024-03-26 11:14:14,709 EPOCH 5 done: loss 0.1188 - lr: 0.000017
2024-03-26 11:14:15,653 DEV : loss 0.20735225081443787 - f1-score (micro avg) 0.8886
2024-03-26 11:14:15,654 saving best model
2024-03-26 11:14:16,137 ----------------------------------------------------------------------------------------------------
2024-03-26 11:14:17,840 epoch 6 - iter 4/48 - loss 0.11428082 - time (sec): 1.70 - samples/sec: 1463.60 - lr: 0.000017 - momentum: 0.000000
2024-03-26 11:14:20,285 epoch 6 - iter 8/48 - loss 0.10615835 - time (sec): 4.15 - samples/sec: 1543.50 - lr: 0.000016 - momentum: 0.000000
2024-03-26 11:14:22,266 epoch 6 - iter 12/48 - loss 0.09809695 - time (sec): 6.13 - samples/sec: 1478.37 - lr: 0.000016 - momentum: 0.000000
2024-03-26 11:14:24,383 epoch 6 - iter 16/48 - loss 0.09716603 - time (sec): 8.24 - samples/sec: 1470.82 - lr: 0.000016 - momentum: 0.000000
2024-03-26 11:14:27,150 epoch 6 - iter 20/48 - loss 0.09527628 - time (sec): 11.01 - samples/sec: 1450.83 - lr: 0.000015 - momentum: 0.000000
2024-03-26 11:14:28,723 epoch 6 - iter 24/48 - loss 0.10550915 - time (sec): 12.58 - samples/sec: 1470.85 - lr: 0.000015 - momentum: 0.000000
2024-03-26 11:14:30,147 epoch 6 - iter 28/48 - loss 0.10560312 - time (sec): 14.01 - samples/sec: 1475.45 - lr: 0.000015 - momentum: 0.000000
2024-03-26 11:14:31,356 epoch 6 - iter 32/48 - loss 0.10260877 - time (sec): 15.22 - samples/sec: 1495.04 - lr: 0.000015 - momentum: 0.000000
2024-03-26 11:14:32,882 epoch 6 - iter 36/48 - loss 0.09749042 - time (sec): 16.74 - samples/sec: 1524.96 - lr: 0.000014 - momentum: 0.000000
2024-03-26 11:14:34,858 epoch 6 - iter 40/48 - loss 0.09905047 - time (sec): 18.72 - samples/sec: 1513.84 - lr: 0.000014 - momentum: 0.000000
2024-03-26 11:14:37,140 epoch 6 - iter 44/48 - loss 0.09659837 - time (sec): 21.00 - samples/sec: 1531.03 - lr: 0.000014 - momentum: 0.000000
2024-03-26 11:14:38,883 epoch 6 - iter 48/48 - loss 0.09620050 - time (sec): 22.74 - samples/sec: 1515.63 - lr: 0.000014 - momentum: 0.000000
2024-03-26 11:14:38,883 ----------------------------------------------------------------------------------------------------
2024-03-26 11:14:38,883 EPOCH 6 done: loss 0.0962 - lr: 0.000014
2024-03-26 11:14:39,836 DEV : loss 0.18259809911251068 - f1-score (micro avg) 0.9103
2024-03-26 11:14:39,837 saving best model
2024-03-26 11:14:40,303 ----------------------------------------------------------------------------------------------------
2024-03-26 11:14:41,940 epoch 7 - iter 4/48 - loss 0.06204275 - time (sec): 1.64 - samples/sec: 1488.75 - lr: 0.000013 - momentum: 0.000000
2024-03-26 11:14:43,601 epoch 7 - iter 8/48 - loss 0.08162162 - time (sec): 3.30 - samples/sec: 1502.42 - lr: 0.000013 - momentum: 0.000000
2024-03-26 11:14:45,773 epoch 7 - iter 12/48 - loss 0.07584555 - time (sec): 5.47 - samples/sec: 1439.06 - lr: 0.000013 - momentum: 0.000000
2024-03-26 11:14:47,847 epoch 7 - iter 16/48 - loss 0.07316749 - time (sec): 7.54 - samples/sec: 1477.06 - lr: 0.000012 - momentum: 0.000000
2024-03-26 11:14:48,507 epoch 7 - iter 20/48 - loss 0.06894237 - time (sec): 8.20 - samples/sec: 1579.88 - lr: 0.000012 - momentum: 0.000000
2024-03-26 11:14:50,098 epoch 7 - iter 24/48 - loss 0.06789884 - time (sec): 9.79 - samples/sec: 1564.39 - lr: 0.000012 - momentum: 0.000000
2024-03-26 11:14:53,003 epoch 7 - iter 28/48 - loss 0.06686264 - time (sec): 12.70 - samples/sec: 1466.57 - lr: 0.000012 - momentum: 0.000000
2024-03-26 11:14:55,810 epoch 7 - iter 32/48 - loss 0.06577245 - time (sec): 15.51 - samples/sec: 1397.17 - lr: 0.000011 - momentum: 0.000000
2024-03-26 11:14:58,649 epoch 7 - iter 36/48 - loss 0.07040216 - time (sec): 18.34 - samples/sec: 1405.31 - lr: 0.000011 - momentum: 0.000000
2024-03-26 11:15:00,626 epoch 7 - iter 40/48 - loss 0.07466776 - time (sec): 20.32 - samples/sec: 1414.61 - lr: 0.000011 - momentum: 0.000000
2024-03-26 11:15:03,204 epoch 7 - iter 44/48 - loss 0.07480391 - time (sec): 22.90 - samples/sec: 1390.99 - lr: 0.000010 - momentum: 0.000000
2024-03-26 11:15:05,048 epoch 7 - iter 48/48 - loss 0.07371206 - time (sec): 24.74 - samples/sec: 1393.15 - lr: 0.000010 - momentum: 0.000000
2024-03-26 11:15:05,048 ----------------------------------------------------------------------------------------------------
2024-03-26 11:15:05,048 EPOCH 7 done: loss 0.0737 - lr: 0.000010
2024-03-26 11:15:06,019 DEV : loss 0.1877364218235016 - f1-score (micro avg) 0.9073
2024-03-26 11:15:06,020 ----------------------------------------------------------------------------------------------------
2024-03-26 11:15:08,718 epoch 8 - iter 4/48 - loss 0.07579462 - time (sec): 2.70 - samples/sec: 1224.41 - lr: 0.000010 - momentum: 0.000000
2024-03-26 11:15:10,839 epoch 8 - iter 8/48 - loss 0.05991518 - time (sec): 4.82 - samples/sec: 1217.82 - lr: 0.000010 - momentum: 0.000000
2024-03-26 11:15:14,025 epoch 8 - iter 12/48 - loss 0.05723449 - time (sec): 8.00 - samples/sec: 1210.78 - lr: 0.000009 - momentum: 0.000000
2024-03-26 11:15:16,001 epoch 8 - iter 16/48 - loss 0.06624845 - time (sec): 9.98 - samples/sec: 1236.69 - lr: 0.000009 - momentum: 0.000000
2024-03-26 11:15:17,487 epoch 8 - iter 20/48 - loss 0.06264195 - time (sec): 11.47 - samples/sec: 1280.67 - lr: 0.000009 - momentum: 0.000000
2024-03-26 11:15:20,048 epoch 8 - iter 24/48 - loss 0.06002540 - time (sec): 14.03 - samples/sec: 1272.13 - lr: 0.000009 - momentum: 0.000000
2024-03-26 11:15:21,828 epoch 8 - iter 28/48 - loss 0.06409926 - time (sec): 15.81 - samples/sec: 1308.14 - lr: 0.000008 - momentum: 0.000000
2024-03-26 11:15:23,457 epoch 8 - iter 32/48 - loss 0.06212462 - time (sec): 17.44 - samples/sec: 1334.23 - lr: 0.000008 - momentum: 0.000000
2024-03-26 11:15:24,748 epoch 8 - iter 36/48 - loss 0.06077673 - time (sec): 18.73 - samples/sec: 1366.08 - lr: 0.000008 - momentum: 0.000000
2024-03-26 11:15:27,154 epoch 8 - iter 40/48 - loss 0.06101414 - time (sec): 21.13 - samples/sec: 1371.91 - lr: 0.000007 - momentum: 0.000000
2024-03-26 11:15:30,041 epoch 8 - iter 44/48 - loss 0.05819661 - time (sec): 24.02 - samples/sec: 1341.24 - lr: 0.000007 - momentum: 0.000000
2024-03-26 11:15:32,117 epoch 8 - iter 48/48 - loss 0.05850858 - time (sec): 26.10 - samples/sec: 1320.96 - lr: 0.000007 - momentum: 0.000000
2024-03-26 11:15:32,117 ----------------------------------------------------------------------------------------------------
2024-03-26 11:15:32,117 EPOCH 8 done: loss 0.0585 - lr: 0.000007
2024-03-26 11:15:33,074 DEV : loss 0.18553169071674347 - f1-score (micro avg) 0.9269
2024-03-26 11:15:33,075 saving best model
2024-03-26 11:15:33,555 ----------------------------------------------------------------------------------------------------
2024-03-26 11:15:35,421 epoch 9 - iter 4/48 - loss 0.05732311 - time (sec): 1.86 - samples/sec: 1525.50 - lr: 0.000007 - momentum: 0.000000
2024-03-26 11:15:37,946 epoch 9 - iter 8/48 - loss 0.04919745 - time (sec): 4.39 - samples/sec: 1397.05 - lr: 0.000006 - momentum: 0.000000
2024-03-26 11:15:40,357 epoch 9 - iter 12/48 - loss 0.06047389 - time (sec): 6.80 - samples/sec: 1357.42 - lr: 0.000006 - momentum: 0.000000
2024-03-26 11:15:42,457 epoch 9 - iter 16/48 - loss 0.06071694 - time (sec): 8.90 - samples/sec: 1358.99 - lr: 0.000006 - momentum: 0.000000
2024-03-26 11:15:43,964 epoch 9 - iter 20/48 - loss 0.05316476 - time (sec): 10.41 - samples/sec: 1416.10 - lr: 0.000006 - momentum: 0.000000
2024-03-26 11:15:45,205 epoch 9 - iter 24/48 - loss 0.05009929 - time (sec): 11.65 - samples/sec: 1462.49 - lr: 0.000005 - momentum: 0.000000
2024-03-26 11:15:46,906 epoch 9 - iter 28/48 - loss 0.04927559 - time (sec): 13.35 - samples/sec: 1481.52 - lr: 0.000005 - momentum: 0.000000
2024-03-26 11:15:49,258 epoch 9 - iter 32/48 - loss 0.05509981 - time (sec): 15.70 - samples/sec: 1464.55 - lr: 0.000005 - momentum: 0.000000
2024-03-26 11:15:51,996 epoch 9 - iter 36/48 - loss 0.05508145 - time (sec): 18.44 - samples/sec: 1416.65 - lr: 0.000004 - momentum: 0.000000
2024-03-26 11:15:54,978 epoch 9 - iter 40/48 - loss 0.05449908 - time (sec): 21.42 - samples/sec: 1375.91 - lr: 0.000004 - momentum: 0.000000
2024-03-26 11:15:56,846 epoch 9 - iter 44/48 - loss 0.05400296 - time (sec): 23.29 - samples/sec: 1390.32 - lr: 0.000004 - momentum: 0.000000
2024-03-26 11:15:57,905 epoch 9 - iter 48/48 - loss 0.05389343 - time (sec): 24.35 - samples/sec: 1415.78 - lr: 0.000004 - momentum: 0.000000
2024-03-26 11:15:57,905 ----------------------------------------------------------------------------------------------------
2024-03-26 11:15:57,905 EPOCH 9 done: loss 0.0539 - lr: 0.000004
2024-03-26 11:15:58,855 DEV : loss 0.1756318360567093 - f1-score (micro avg) 0.9235
2024-03-26 11:15:58,857 ----------------------------------------------------------------------------------------------------
2024-03-26 11:16:01,220 epoch 10 - iter 4/48 - loss 0.02756730 - time (sec): 2.36 - samples/sec: 1397.98 - lr: 0.000003 - momentum: 0.000000
2024-03-26 11:16:03,396 epoch 10 - iter 8/48 - loss 0.03411980 - time (sec): 4.54 - samples/sec: 1361.41 - lr: 0.000003 - momentum: 0.000000
2024-03-26 11:16:05,325 epoch 10 - iter 12/48 - loss 0.03366891 - time (sec): 6.47 - samples/sec: 1364.60 - lr: 0.000003 - momentum: 0.000000
2024-03-26 11:16:06,556 epoch 10 - iter 16/48 - loss 0.03593148 - time (sec): 7.70 - samples/sec: 1431.57 - lr: 0.000002 - momentum: 0.000000
2024-03-26 11:16:08,558 epoch 10 - iter 20/48 - loss 0.04270731 - time (sec): 9.70 - samples/sec: 1413.16 - lr: 0.000002 - momentum: 0.000000
2024-03-26 11:16:10,903 epoch 10 - iter 24/48 - loss 0.05028433 - time (sec): 12.04 - samples/sec: 1378.76 - lr: 0.000002 - momentum: 0.000000
2024-03-26 11:16:11,805 epoch 10 - iter 28/48 - loss 0.05128601 - time (sec): 12.95 - samples/sec: 1451.20 - lr: 0.000002 - momentum: 0.000000
2024-03-26 11:16:13,125 epoch 10 - iter 32/48 - loss 0.04953866 - time (sec): 14.27 - samples/sec: 1488.71 - lr: 0.000001 - momentum: 0.000000
2024-03-26 11:16:15,943 epoch 10 - iter 36/48 - loss 0.04689467 - time (sec): 17.08 - samples/sec: 1445.36 - lr: 0.000001 - momentum: 0.000000
2024-03-26 11:16:18,324 epoch 10 - iter 40/48 - loss 0.04744759 - time (sec): 19.47 - samples/sec: 1477.13 - lr: 0.000001 - momentum: 0.000000
2024-03-26 11:16:20,948 epoch 10 - iter 44/48 - loss 0.04670001 - time (sec): 22.09 - samples/sec: 1451.98 - lr: 0.000001 - momentum: 0.000000
2024-03-26 11:16:22,941 epoch 10 - iter 48/48 - loss 0.04566351 - time (sec): 24.08 - samples/sec: 1431.38 - lr: 0.000000 - momentum: 0.000000
2024-03-26 11:16:22,942 ----------------------------------------------------------------------------------------------------
2024-03-26 11:16:22,942 EPOCH 10 done: loss 0.0457 - lr: 0.000000
2024-03-26 11:16:23,896 DEV : loss 0.18053248524665833 - f1-score (micro avg) 0.9227
2024-03-26 11:16:24,184 ----------------------------------------------------------------------------------------------------
2024-03-26 11:16:24,185 Loading model from best epoch ...
2024-03-26 11:16:25,052 SequenceTagger predicts: Dictionary with 17 tags: O, S-Unternehmen, B-Unternehmen, E-Unternehmen, I-Unternehmen, S-Auslagerung, B-Auslagerung, E-Auslagerung, I-Auslagerung, S-Ort, B-Ort, E-Ort, I-Ort, S-Software, B-Software, E-Software, I-Software
2024-03-26 11:16:25,801
Results:
- F-score (micro) 0.8963
- F-score (macro) 0.6819
- Accuracy 0.8144
By class:
              precision    recall  f1-score   support

 Unternehmen     0.8859    0.8759    0.8809       266
 Auslagerung     0.8577    0.8956    0.8762       249
         Ort     0.9565    0.9851    0.9706       134
    Software     0.0000    0.0000    0.0000         0

   micro avg     0.8869    0.9060    0.8963       649
   macro avg     0.6750    0.6891    0.6819       649
weighted avg     0.8897    0.9060    0.8976       649
2024-03-26 11:16:25,801 ----------------------------------------------------------------------------------------------------
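Note: the test figures above come from the best checkpoint (epoch 8, dev micro F1 0.9269). A minimal usage sketch for the saved model follows; the example sentence is made up for illustration.

# Sketch: load the saved tagger and predict BIOES spans (example sentence is hypothetical).
from flair.data import Sentence
from flair.models import SequenceTagger

tagger = SequenceTagger.load(
    "flair-co-funer-german_bert_base-bs16-e10-lr3e-05-2/best-model.pt"
)

sentence = Sentence("Die Kontoführung wurde an die Beispielbank AG in Frankfurt ausgelagert.")
tagger.predict(sentence)

# Print predicted spans with their labels (Unternehmen, Auslagerung, Ort, Software).
for span in sentence.get_spans("ner"):
    print(span.text, span.get_label("ner").value, span.get_label("ner").score)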