update
README.md
CHANGED
@@ -4,14 +4,20 @@ base_model: xlm-roberta-large
 tags:
 - generated_from_trainer
 model-index:
-- name:
+- name: EthioLLM-l-70K
   results: []
+language:
+- am
+- om
+- so
+- ti
+- gez
 ---

 <!-- This model card has been generated automatically according to the information the Trainer had access to. You
 should probably proofread and complete it, then remove this comment. -->

-#
+# EthioLLM-l-70K

 This model is a fine-tuned version of [xlm-roberta-large](https://huggingface.co/xlm-roberta-large) on an unknown dataset.
 It achieves the following results on the evaluation set:
@@ -52,3 +58,12 @@ The following hyperparameters were used during training:
 - Pytorch 2.0.1+cu117
 - Datasets 2.14.4
 - Tokenizers 0.13.3
+
+### Citation Information
+
+@article{tonja2024ethiollm,
+  title={EthioLLM: Multilingual Large Language Models for Ethiopian Languages with Task Evaluation},
+  author={Tonja, Atnafu Lambebo and Azime, Israel Abebe and Belay, Tadesse Destaw and Yigezu, Mesay Gemeda and Mehamed, Moges Ahmed and Ayele, Abinew Ali and Jibril, Ebrahim Chekol and Woldeyohannis, Michael Melese and Kolesnikova, Olga and Slusallek, Philipp and others},
+  journal={arXiv preprint arXiv:2403.13737},
+  year={2024}
+}