Prazzwal07 committed
Commit 97ddf9d · verified · 1 parent: 6c8370a

Update README.md

Files changed (1):
1. README.md (+9, -17)
README.md CHANGED

@@ -244,27 +244,19 @@ RoBERTa architecture with 12 transformer layers, hidden size of 768, 12 attention heads
 
 ---
 
-## Citation
-
-<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
-
-**BibTeX:**
-
+## Citation
 ```
-@misc{IRIISNEPAL_RoBERTa_Nepali_110M,
-  title = {Development of Pre-trained Transformer-based Models for the Nepali Language},
-  author = {Thapa, Prajwal and Nyachhyon, Jinu and Sharma, Mridul and Bal, Bal Krishna},
-  year = {2024},
-  note = {Submitted to COLING 2025}
+@misc{thapa2024developmentpretrainedtransformerbasedmodels,
+  title={Development of Pre-Trained Transformer-based Models for the Nepali Language},
+  author={Prajwal Thapa and Jinu Nyachhyon and Mridul Sharma and Bal Krishna Bal},
+  year={2024},
+  eprint={2411.15734},
+  archivePrefix={arXiv},
+  primaryClass={cs.CL},
+  url={https://arxiv.org/abs/2411.15734},
 }
 ```
 
-**APA:**
-
-```
-Thapa, P., Nyachhyon, J., Sharma, M., & Bal, B. K. (2024). Development of pre-trained transformer-based models for the Nepali language. Manuscript submitted for publication to COLING 2025.
-```
-
 ---
 
 ## Model Card Contact
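
A side effect of this diff worth noting: the BibTeX key changes from `IRIISNEPAL_RoBERTa_Nepali_110M` to `thapa2024developmentpretrainedtransformerbasedmodels`, so any existing `\cite` commands pointing at the old key will stop resolving. A minimal LaTeX sketch of citing the updated entry, assuming (hypothetically) the new BibTeX block is saved as `references.bib`:

```latex
% Minimal sketch: cite the model paper with the key introduced in this commit.
% The filename references.bib is an assumption, not part of the model card.
\documentclass{article}
\begin{document}
The Nepali RoBERTa model is described by
\cite{thapa2024developmentpretrainedtransformerbasedmodels}. % new key from this diff
\bibliographystyle{plain}
\bibliography{references} % old key IRIISNEPAL_RoBERTa_Nepali_110M no longer exists
\end{document}
```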