---
language: zh-tw
datasets: DRCD
tasks: Question Answering
---
# BERT DRCD 384
This model is a fine-tuned checkpoint of [bert-base-chinese](https://huggingface.co/bert-base-chinese), trained on the DRCD (Delta Reading Comprehension Dataset) Traditional Chinese question-answering dataset.
It reaches an F1 score of 86 and an EM (exact match) score of 83.
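Below is a minimal usage sketch with the `transformers` question-answering pipeline. The repository ID `user/bert-drcd-384` is a placeholder, not the actual model ID; substitute the real Hub path of this checkpoint.

```python
# Minimal inference sketch (assumes the transformers library is installed and
# that this checkpoint is published on the Hugging Face Hub).
from transformers import pipeline

MODEL_ID = "user/bert-drcd-384"  # hypothetical repo ID -- replace with the real one

qa = pipeline("question-answering", model=MODEL_ID, tokenizer=MODEL_ID)

result = qa(
    question="廣義相對論是誰提出的？",
    context="廣義相對論是愛因斯坦於1915年發表的引力理論。",
)
print(result["answer"], result["score"])
```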
Training arguments (a fine-tuning sketch follows this list):
- max sequence length: 384
- stride: 128
- learning_rate: 3e-5
- batch_size: 10
- epochs: 3
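The sketch below shows roughly how these arguments map onto a standard `transformers` fine-tuning setup. It assumes DRCD is available in SQuAD-style format; answer start/end labeling and the training loop itself are omitted here, so treat this as an illustration of the hyperparameters rather than the exact training script.

```python
# Rough fine-tuning sketch matching the arguments listed above
# (assumes transformers; preprocessing of answer spans is omitted for brevity).
from transformers import (
    AutoTokenizer,
    AutoModelForQuestionAnswering,
    TrainingArguments,
)

tokenizer = AutoTokenizer.from_pretrained("bert-base-chinese")
model = AutoModelForQuestionAnswering.from_pretrained("bert-base-chinese")

def tokenize(examples):
    # Long contexts are split into overlapping 384-token windows
    # with a stride of 128, as in the arguments above.
    return tokenizer(
        examples["question"],
        examples["context"],
        max_length=384,
        stride=128,
        truncation="only_second",
        return_overflowing_tokens=True,
        return_offsets_mapping=True,
        padding="max_length",
    )

training_args = TrainingArguments(
    output_dir="bert-drcd-384",
    learning_rate=3e-5,
    per_device_train_batch_size=10,
    num_train_epochs=3,
)
```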
See the [Colab notebook](https://colab.research.google.com/drive/1kZv7ZRmvUdCKEhQg8MBrKljGWvV2X3CP?usp=sharing) for details.