---
language: ar
license: apache-2.0
datasets:
- AQMAR
- ANERcorp
thumbnail: https://raw.githubusercontent.com/JetRunner/BERT-of-Theseus/master/bert-of-theseus.png
embeddings:
- GloVe
- Flair
tags:
- Text Classification
metrics:
- f1
---
# Arabic NER Model using Flair Embeddings
Training was conducted over 94 epochs with a mini-batch size of 32, using stacked GloVe and Flair forward and backward embeddings and a linearly decaying learning rate starting at 0.225 and annealed down to 2e-05.
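
The snippet below is a minimal sketch of how such a tagger can be trained with Flair; it is not the original training script. The corpus path, column format and the Arabic Flair embedding identifiers are assumptions, while the hyperparameters mirror those listed above.

```python
# Minimal training sketch (assumptions: corpus location/format, embedding identifiers).
from flair.datasets import ColumnCorpus
from flair.embeddings import WordEmbeddings, FlairEmbeddings, StackedEmbeddings
from flair.models import SequenceTagger
from flair.trainers import ModelTrainer

# hypothetical column-formatted NER corpus (e.g. AQMAR/ANERcorp in CoNLL-style columns)
corpus = ColumnCorpus("data/", {0: "text", 1: "ner"})
tag_dictionary = corpus.make_tag_dictionary(tag_type="ner")

# GloVe word embeddings stacked with Flair forward/backward character language models
embeddings = StackedEmbeddings([
    WordEmbeddings("glove"),
    FlairEmbeddings("ar-forward"),   # assumed identifier for the Arabic forward LM
    FlairEmbeddings("ar-backward"),  # assumed identifier for the Arabic backward LM
])

tagger = SequenceTagger(
    hidden_size=256,                 # matches the BiLSTM hidden size in the log below
    embeddings=embeddings,
    tag_dictionary=tag_dictionary,
    tag_type="ner",
)

trainer = ModelTrainer(tagger, corpus)
trainer.train(
    "resources/taggers/arabic-ner",  # output directory (placeholder)
    learning_rate=0.225,             # initial learning rate
    min_learning_rate=2e-05,         # lower bound the learning rate decays to
    mini_batch_size=32,
    max_epochs=94,
)
```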
Results:
- F1-score (micro) 0.8666
- F1-score (macro) 0.8488
| class | tp | fp | fn | precision | recall | class-F1 |
|------|-----|----|----|-----------|--------|----------|
| LOC | 539 | 51 | 68 | 0.9136 | 0.8880 | 0.9006 |
| MISC | 408 | 57 | 89 | 0.8774 | 0.8209 | 0.8482 |
| ORG | 167 | 43 | 64 | 0.7952 | 0.7229 | 0.7574 |
| PER | 501 | 65 | 60 | 0.8852 | 0.8930 | 0.8891 |
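
For reference, the micro- and macro-averaged scores follow directly from the per-class counts in the table; a small sanity-check script:

```python
# Recompute micro- and macro-averaged F1 from the per-class tp/fp/fn counts above.
tp = {"LOC": 539, "MISC": 408, "ORG": 167, "PER": 501}
fp = {"LOC": 51,  "MISC": 57,  "ORG": 43,  "PER": 65}
fn = {"LOC": 68,  "MISC": 89,  "ORG": 64,  "PER": 60}

def f1(tp, fp, fn):
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    return 2 * precision * recall / (precision + recall)

micro = f1(sum(tp.values()), sum(fp.values()), sum(fn.values()))
macro = sum(f1(tp[c], fp[c], fn[c]) for c in tp) / len(tp)
print(f"micro-F1 = {micro:.4f}, macro-F1 = {macro:.4f}")  # ≈ 0.8666, 0.8488
```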
---
```
2020-10-27 12:05:47,801 Model: "SequenceTagger(
  (embeddings): StackedEmbeddings(
    (list_embedding_0): WordEmbeddings('glove')
    (list_embedding_1): FlairEmbeddings(
      (lm): LanguageModel(
        (drop): Dropout(p=0.1, inplace=False)
        (encoder): Embedding(7125, 100)
        (rnn): LSTM(100, 2048)
        (decoder): Linear(in_features=2048, out_features=7125, bias=True)
      )
    )
    (list_embedding_2): FlairEmbeddings(
      (lm): LanguageModel(
        (drop): Dropout(p=0.1, inplace=False)
        (encoder): Embedding(7125, 100)
        (rnn): LSTM(100, 2048)
        (decoder): Linear(in_features=2048, out_features=7125, bias=True)
      )
    )
  )
  (word_dropout): WordDropout(p=0.05)
  (locked_dropout): LockedDropout(p=0.5)
  (embedding2nn): Linear(in_features=4196, out_features=4196, bias=True)
  (rnn): LSTM(4196, 256, batch_first=True, bidirectional=True)
  (linear): Linear(in_features=512, out_features=15, bias=True)
  (beta): 1.0
  (weights): None
  (weight_tensor) None
)"
```
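
A minimal usage sketch for tagging Arabic text with this model; the identifier passed to `SequenceTagger.load` is an assumption and may need to be replaced with this repository's id or a local checkpoint path.

```python
from flair.data import Sentence
from flair.models import SequenceTagger

# load the tagger; the identifier below is a placeholder for this repository's id
# or a local .pt checkpoint path
tagger = SequenceTagger.load("megantosh/flair-arabic-ner")

# create and tag an Arabic sentence
sentence = Sentence("ولد محمد في القاهرة .")  # "Mohamed was born in Cairo."
tagger.predict(sentence)

# print the detected entity spans (LOC, MISC, ORG, PER)
for entity in sentence.get_spans("ner"):
    print(entity)
```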