---
language: zh-tw
datasets:
- DRCD
tasks:
- Question Answering
---
# BERT DRCD 384
This model is a fine-tuned checkpoint of [bert-base-chinese](https://huggingface.co/bert-base-chinese), trained on the DRCD dataset for extractive question answering.

It reaches an F1 score of 86 and an EM (exact match) score of 83.
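Once published, the checkpoint can be loaded through the standard `transformers` question-answering pipeline. A minimal usage sketch (the repo id `bert-drcd-384` is a placeholder, not a Hub path confirmed by this card):

```python
from transformers import pipeline

# Placeholder repo id -- substitute this model's actual path on the Hub.
qa = pipeline("question-answering", model="bert-drcd-384")

result = qa(
    question="DRCD 是什麼樣的資料集？",
    context="台達閱讀理解資料集（DRCD）是一個繁體中文的機器閱讀理解資料集。",
)
print(result["answer"], result["score"])
```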
Training arguments (see the sketch after this list for how they might map to a `transformers` setup):
- max sequence length: 384
- doc stride: 128
- learning_rate: 3e-5
- batch_size: 10
- epochs: 3
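A minimal sketch of how these arguments might map onto a `transformers` fine-tuning setup, assuming the usual sliding-window preprocessing for extractive QA. DRCD data loading, answer-span labeling, and the `Trainer` loop are omitted; `output_dir` is illustrative, and `batch_size: 10` is read here as a per-device value:

```python
from transformers import (
    AutoModelForQuestionAnswering,
    AutoTokenizer,
    TrainingArguments,
)

tokenizer = AutoTokenizer.from_pretrained("bert-base-chinese")
model = AutoModelForQuestionAnswering.from_pretrained("bert-base-chinese")

def tokenize(example):
    # Long contexts are split into overlapping 384-token windows; the
    # stride of 128 is the number of tokens shared between consecutive
    # windows, so an answer near a window boundary still appears whole
    # in at least one window.
    return tokenizer(
        example["question"],
        example["context"],
        max_length=384,
        stride=128,
        truncation="only_second",
        return_overflowing_tokens=True,
        return_offsets_mapping=True,
        padding="max_length",
    )

# Hyperparameters from this card; everything else is a default.
training_args = TrainingArguments(
    output_dir="bert-drcd-384",  # illustrative name
    learning_rate=3e-5,
    per_device_train_batch_size=10,
    num_train_epochs=3,
)
```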
See the [Colab notebook](https://colab.research.google.com/drive/1kZv7ZRmvUdCKEhQg8MBrKljGWvV2X3CP?usp=sharing) for details.