2022-03-27 02:06:28,163 ----------------------------------------------------------------------------------------------------
2022-03-27 02:06:28,170 Model: "SequenceTagger(
  (embeddings): StackedEmbeddings(
    (list_embedding_0): WordEmbeddings('fa')
    (list_embedding_1): FlairEmbeddings(
      (lm): LanguageModel(
        (drop): Dropout(p=0.1, inplace=False)
        (encoder): Embedding(5105, 100)
        (rnn): LSTM(100, 2048)
        (decoder): Linear(in_features=2048, out_features=5105, bias=True)
      )
    )
    (list_embedding_2): FlairEmbeddings(
      (lm): LanguageModel(
        (drop): Dropout(p=0.1, inplace=False)
        (encoder): Embedding(5105, 100)
        (rnn): LSTM(100, 2048)
        (decoder): Linear(in_features=2048, out_features=5105, bias=True)
      )
    )
  )
  (word_dropout): WordDropout(p=0.05)
  (locked_dropout): LockedDropout(p=0.5)
  (embedding2nn): Linear(in_features=4396, out_features=4396, bias=True)
  (rnn): LSTM(4396, 256, batch_first=True, bidirectional=True)
  (linear): Linear(in_features=512, out_features=17, bias=True)
  (beta): 1.0
  (weights): None
  (weight_tensor) None
)"
2022-03-27 02:06:28,173 ----------------------------------------------------------------------------------------------------
2022-03-27 02:06:28,178 Corpus: "Corpus: 23060 train + 4070 dev + 4150 test sentences"
2022-03-27 02:06:28,184 ----------------------------------------------------------------------------------------------------
2022-03-27 02:06:28,189 Parameters:
2022-03-27 02:06:28,194 - learning_rate: "0.1"
2022-03-27 02:06:28,198 - mini_batch_size: "4"
2022-03-27 02:06:28,203 - patience: "3"
2022-03-27 02:06:28,207 - anneal_factor: "0.5"
2022-03-27 02:06:28,212 - max_epochs: "2"
2022-03-27 02:06:28,217 - shuffle: "True"
2022-03-27 02:06:28,219 - train_with_dev: "False"
2022-03-27 02:06:28,225 - batch_growth_annealing: "False"
2022-03-27 02:06:28,227 ----------------------------------------------------------------------------------------------------
2022-03-27 02:06:28,231 Model training base path: "/content/gdrive/MyDrive/project/data/ner/model2"
2022-03-27 02:06:28,238 ----------------------------------------------------------------------------------------------------
2022-03-27 02:06:28,242 Device: cuda:0
2022-03-27 02:06:28,244 ----------------------------------------------------------------------------------------------------
2022-03-27 02:06:28,247 Embeddings storage mode: none
2022-03-27 02:06:30,459 ----------------------------------------------------------------------------------------------------
2022-03-27 02:06:30,469 Testing using last state of model ...
2022-03-27 02:12:42,501 0.8475 0.7185 0.7777 0.647
2022-03-27 02:12:42,506
Results:
- F-score (micro) 0.7777
- F-score (macro) 0.7833
- Accuracy 0.647
By class:
              precision    recall  f1-score   support

         LOC     0.8821    0.7877    0.8322      4083
         ORG     0.8088    0.6105    0.6958      3166
         PER     0.8381    0.7443    0.7884      2741
         DAT     0.8298    0.6487    0.7282      1150
         MON     0.9377    0.8852    0.9107       357
         TIM     0.5741    0.5602    0.5671       166
         PCT     0.9737    0.9487    0.9610       156

   micro avg     0.8475    0.7185    0.7777     11819
   macro avg     0.8349    0.7408    0.7833     11819
weighted avg     0.8457    0.7185    0.7757     11819
 samples avg     0.6470    0.6470    0.6470     11819
2022-03-27 02:12:42,508 ----------------------------------------------------------------------------------------------------
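As a sanity check, the summary F-scores in the log can be reproduced from the per-class table. A minimal sketch (the numbers below are copied from the log; the dict name is ours):

```python
# Recompute the micro and macro F-scores reported in the Flair log
# from the per-class precision/recall/F1 table.
per_class = {
    # label: (precision, recall, f1, support) as printed in the log
    "LOC": (0.8821, 0.7877, 0.8322, 4083),
    "ORG": (0.8088, 0.6105, 0.6958, 3166),
    "PER": (0.8381, 0.7443, 0.7884, 2741),
    "DAT": (0.8298, 0.6487, 0.7282, 1150),
    "MON": (0.9377, 0.8852, 0.9107, 357),
    "TIM": (0.5741, 0.5602, 0.5671, 166),
    "PCT": (0.9737, 0.9487, 0.9610, 156),
}

# Macro F1: unweighted mean of the per-class F1 scores.
macro_f1 = sum(f1 for _, _, f1, _ in per_class.values()) / len(per_class)

# Micro F1: harmonic mean of the micro-averaged precision and recall
# (0.8475 and 0.7185 on the log's summary line).
micro_p, micro_r = 0.8475, 0.7185
micro_f1 = 2 * micro_p * micro_r / (micro_p + micro_r)

print(f"macro F1 = {macro_f1:.4f}")  # matches the logged 0.7833
print(f"micro F1 = {micro_f1:.4f}")  # matches the logged 0.7777
```

The 0.647 "Accuracy" in the summary line corresponds to the samples-average row, not token-level accuracy.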