---
|
language: multilingual |
|
|
|
datasets:
- wikipedia
|
|
|
license: apache-2.0 |
|
|
|
widget: |
|
- text: "Google generated 46 billion [MASK] in revenue." |
|
- text: "Paris is the capital of [MASK]." |
|
- text: "Algiers is the largest city in [MASK]." |
|
- text: "تقع سويسرا في [MASK] أوروبا" |
|
- text: "إسمي محمد وأسكن في [MASK]." |
|
--- |
|
|
|
# bert-base-en-ar-cased |
|
|
|
We are sharing smaller versions of [bert-base-multilingual-cased](https://huggingface.co/bert-base-multilingual-cased) that handle a custom number of languages. |
|
|
|
Unlike [distilbert-base-multilingual-cased](https://huggingface.co/distilbert-base-multilingual-cased), our versions produce exactly the same representations as the original model, thereby preserving its accuracy.
|
|
|
For more information, please refer to our paper: [Load What You Need: Smaller Versions of Multilingual BERT](https://www.aclweb.org/anthology/2020.sustainlp-1.16.pdf).
|
|
|
## How to use |
|
|
|
```python |
|
from transformers import AutoTokenizer, AutoModel |
|
|
|
tokenizer = AutoTokenizer.from_pretrained("Geotrend/bert-base-en-ar-cased") |
|
model = AutoModel.from_pretrained("Geotrend/bert-base-en-ar-cased") |
|
|
|
``` |
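Once loaded, the model can be used like any BERT encoder. Below is a minimal sketch of extracting contextual representations; the example sentence is illustrative, and it assumes `torch` is installed alongside `transformers`:

```python
import torch
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("Geotrend/bert-base-en-ar-cased")
model = AutoModel.from_pretrained("Geotrend/bert-base-en-ar-cased")

# Tokenize an input sentence and run it through the encoder
inputs = tokenizer("Paris is the capital of [MASK].", return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# last_hidden_state has shape (batch_size, sequence_length, hidden_size);
# hidden_size remains 768, as in bert-base-multilingual-cased
print(outputs.last_hidden_state.shape)
```

Because only the vocabulary is reduced, these hidden states match those of the original multilingual model for English and Arabic inputs.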
|
|
|
To generate other smaller versions of multilingual transformers please visit [our Github repo](https://github.com/Geotrend-research/smaller-transformers). |
|
|
|
### How to cite |
|
|
|
```bibtex |
|
@inproceedings{smallermbert, |
|
title={Load What You Need: Smaller Versions of Multilingual BERT},
|
author={Abdaoui, Amine and Pradel, Camille and Sigel, Grégoire}, |
|
booktitle={SustaiNLP / EMNLP}, |
|
year={2020} |
|
} |
|
``` |
|
|
|
## Contact |
|
|
|
Please contact [email protected] with any questions, feedback, or requests.
|
|