---
license: apache-2.0
language:
- fr
library_name: transformers
tags:
- t5
- orfeo
- pytorch
- pictograms
- translation
metrics:
- bleu
---

# t2p-t5-large-orféo

T2P-t5-large-orféo is a text-to-pictogram translation model built by fine-tuning [t5-large](https://huggingface.co/google-t5/t5-large) on a dataset of pairs of transcriptions and pictogram token sequences, where each token is linked to a pictogram image from [ARASAAC](https://arasaac.org/).

## Training details

### Datasets

### Parameters

### Evaluation

### Results

### Environmental Impact

## Using t2p-t5-large-orféo model with HuggingFace transformers

A minimal usage sketch (the checkpoint identifier below is a placeholder, not a confirmed Hub id, and the input sentence is only illustrative):

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

# Placeholder id: replace with the actual repository id on the Hugging Face Hub.
model_id = "t2p-t5-large-orfeo"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

# Translate a French transcription into a pictogram token sequence.
inputs = tokenizer("je voudrais manger une pomme", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

- **Language(s):** French
- **License:** Apache-2.0
- **Developed by:** Cécile Macaire
- **Funded by:**
  - GENCI-IDRIS (Grant 2023-AD011013625R1)
  - PROPICTO ANR-20-CE93-0005
- **Authors:**
  - Cécile Macaire
  - Chloé Dion
  - Emmanuelle Esperança-Rodier
  - Benjamin Lecouteux
  - Didier Schwab

## Citation

If you use this model for your own research work, please cite as follows:

```bibtex
```