---
license: apache-2.0
datasets:
- xquad
language:
- multilingual
library_name: transformers
tags:
- cross-lingual
- extractive-question-answering
metrics:
- f1
- exact_match
---
# Description
This is the best-performing "mBERT-qa-en, skd, mAP@k" model from the paper [Promoting Generalized Cross-lingual Question Answering in Few-resource Scenarios via Self-knowledge Distillation](https://arxiv.org/abs/2309.17134).
See the official [GitHub repository](https://github.com/ccasimiro88/self-distillation-gxlt-qa) for the code implementing the methods described in the paper.
**More info coming soon!**
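# How to Use
Since the card declares `library_name: transformers`, the model should load with the standard `question-answering` pipeline. Below is a minimal sketch; the Hub repo id is a placeholder assumption (the card does not state it), so replace it with this model's actual id.
```python
from transformers import pipeline

# Placeholder repo id; replace with this model's actual Hub id.
model_id = "ccasimiro88/mbert-qa-en-skd-map-k"

qa = pipeline("question-answering", model=model_id)

# The model targets cross-lingual extractive QA, so the question and context
# may be in different languages (e.g. an English question over a Spanish context).
result = qa(
    question="Where does the Amazon river begin?",
    context="El río Amazonas nace en los Andes peruanos y fluye hacia el océano Atlántico.",
)
print(result["answer"], result["score"])
```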
# How to Cite
To cite our work, use the following BibTeX entry:
```bibtex
@misc{carrino2023promoting,
  title={Promoting Generalized Cross-lingual Question Answering in Few-resource Scenarios via Self-knowledge Distillation},
  author={Casimiro Pio Carrino and Carlos Escolano and José A. R. Fonollosa},
  year={2023},
  eprint={2309.17134},
  archivePrefix={arXiv},
  primaryClass={cs.CL}
}
```