
Overview

JABER (Junior Arabic BERt) is a 12-layer pretrained Arabic language model. JABER ranked first on the ALUE leaderboard as of 01/09/2021. This model is only compatible with the code in its GitHub repo; it is not supported by the Transformers library.

Citation

Please cite the following paper when using our code and model:

@misc{ghaddar2021jaber,
      title={JABER: Junior Arabic BERt}, 
      author={Abbas Ghaddar and Yimeng Wu and Ahmad Rashid and Khalil Bibi and Mehdi Rezagholizadeh and Chao Xing and Yasheng Wang and Duan Xinyu and Zhefeng Wang and Baoxing Huai and Xin Jiang and Qun Liu and Philippe Langlais},
      year={2021},
      eprint={2112.04329},
      archivePrefix={arXiv},
      primaryClass={cs.CL}
}