xlmr-large-all-CLS-P / test_eval_en.txt
Default classification report:
              precision    recall  f1-score   support

           F     0.8917    0.8560    0.8735       500
           T     0.8615    0.8960    0.8784       500

    accuracy                         0.8760      1000
   macro avg     0.8766    0.8760    0.8760      1000
weighted avg     0.8766    0.8760    0.8760      1000
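
A report in this layout can be reproduced with scikit-learn's classification_report. The sketch below is illustrative only; the y_true / y_pred lists are hypothetical placeholders, not data shipped with this repo.

    # Minimal sketch (assumption: scikit-learn is installed; y_true and y_pred
    # stand in for the gold and predicted F/T labels of the English test set).
    from sklearn.metrics import classification_report

    y_true = ["F", "T", "T", "F"]   # hypothetical gold labels
    y_pred = ["F", "T", "F", "F"]   # hypothetical model predictions

    print("Default classification report:")
    print(classification_report(y_true, y_pred, labels=["F", "T"], digits=4))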
ADJ
Accuracy = 0.8611111111111112
Weighted Recall = 0.8611111111111112
Weighted Precision = 0.8630125010927528
Weighted F1 = 0.8605431137076708
Macro Recall = 0.858359133126935
Macro Precision = 0.8642800944138473
Macro F1 = 0.8597857838364169
ADV
Accuracy = 0.8333333333333334
Weighted Recall = 0.8333333333333334
Weighted Precision = 0.8371040723981902
Weighted F1 = 0.8342857142857143
Macro Recall = 0.8333333333333334
Macro Precision = 0.8257918552036199
Macro F1 = 0.8285714285714285
NOUN
Accuracy = 0.8806818181818182
Weighted Recall = 0.8806818181818182
Weighted Precision = 0.8831406377909783
Weighted F1 = 0.8805108267244117
Macro Recall = 0.8808307626085086
Macro Precision = 0.8830235511429231
Macro F1 = 0.8805271116251171
VERB
Accuracy = 0.8791946308724832
Weighted Recall = 0.8791946308724832
Weighted Precision = 0.880290915661562
Weighted F1 = 0.879107505070994
Macro Recall = 0.8791946308724832
Macro Precision = 0.8802909156615621
Macro F1 = 0.8791075050709939
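
The per-POS blocks above (accuracy plus weighted and macro precision/recall/F1 for ADJ, ADV, NOUN, VERB) can be computed by grouping the test items by POS tag and scoring each group separately. A hedged sketch, assuming scikit-learn and a hypothetical list of (pos_tag, gold, prediction) triples:

    # Sketch only: `examples` is a hypothetical list of (pos_tag, gold, pred)
    # triples; replace it with the actual per-item predictions.
    from collections import defaultdict
    from sklearn.metrics import accuracy_score, precision_recall_fscore_support

    examples = [("ADJ", "F", "F"), ("ADJ", "T", "F"), ("NOUN", "T", "T")]  # placeholder data

    by_tag = defaultdict(lambda: ([], []))
    for tag, gold, pred in examples:
        by_tag[tag][0].append(gold)
        by_tag[tag][1].append(pred)

    for tag, (gold, pred) in sorted(by_tag.items()):
        print(tag)
        print("Accuracy =", accuracy_score(gold, pred))
        for avg in ("weighted", "macro"):
            p, r, f1, _ = precision_recall_fscore_support(
                gold, pred, average=avg, zero_division=0
            )
            print(f"{avg.capitalize()} Recall =", r)
            print(f"{avg.capitalize()} Precision =", p)
            print(f"{avg.capitalize()} F1 =", f1)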