ryo0634 committed
Commit: cc5e9e8
Parent: 39eaaef

Update README.md

Files changed (1)
  1. README.md +8 -6
README.md CHANGED
@@ -50,10 +50,12 @@ The model was initialized with the weights of XLM-RoBERTa(base) and trained usin
 If you find mLUKE useful for your work, please cite the following paper:
 
 ```latex
-@inproceedings{ri2021mluke,
-  title={mLUKE: The Power of Entity Representations in Multilingual Pretrained Language Models},
-  author={Ryokan Ri, Ikuya Yamada, Yoshimasa Tsuruoka},
-  booktitle={arXiv},
-  year={2021}
-}
+@inproceedings{ri-etal-2022-mluke,
+    title = "m{LUKE}: {T}he Power of Entity Representations in Multilingual Pretrained Language Models",
+    author = "Ri, Ryokan and
+      Yamada, Ikuya and
+      Tsuruoka, Yoshimasa",
+    booktitle = "Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)",
+    year = "2022",
+    url = "https://aclanthology.org/2022.acl-long.505",
 ```