Added more information to README. (#1)
Opened by NyistMilan

README.md (CHANGED)
@@ -18,4 +18,32 @@ license: apache-2.0
 
 # Model Card for mT5-base-HunSum-1
 
-The mT5-base-HunSum-1 is a Hungarian summarization model, which was trained on the SZTAKI-HLT/HunSum-1 dataset.
+The mT5-base-HunSum-1 is a Hungarian abstractive summarization model, which was trained on the [SZTAKI-HLT/HunSum-1 dataset](https://huggingface.co/datasets/SZTAKI-HLT/HunSum-1).
+The model is based on [google/mt5-base](https://huggingface.co/google/mt5-base).
+
+## Intended uses & limitations
+
+- **Model type:** Text Summarization
+- **Language(s) (NLP):** Hungarian
+- **Resource(s) for more information:**
+  - [GitHub Repo](https://github.com/dorinapetra/summarization)
+
+## Parameters
+
+- **Batch Size:** 12
+- **Learning Rate:** 5e-5
+- **Weight Decay:** 0.01
+- **Warmup Steps:** 3000
+- **Epochs:** 10
+- **no_repeat_ngram_size:** 3
+- **num_beams:** 5
+- **early_stopping:** False
+- **encoder_no_repeat_ngram_size:** 4
+
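The training hyperparameters in the card's Parameters list map onto a standard Hugging Face fine-tuning setup. A minimal sketch, assuming the `transformers` `Seq2SeqTrainingArguments` API; the output directory and anything not among the five listed values are assumptions, not taken from the card:

```python
# Training hyperparameters copied from the card's "Parameters" section.
TRAINING_HPARAMS = {
    "per_device_train_batch_size": 12,  # Batch Size: 12
    "learning_rate": 5e-5,              # Learning Rate: 5e-5
    "weight_decay": 0.01,               # Weight Decay: 0.01
    "warmup_steps": 3000,               # Warmup Steps: 3000
    "num_train_epochs": 10,             # Epochs: 10
}

def make_training_args(output_dir: str = "mt5-base-hunsum-out"):
    """Build Seq2SeqTrainingArguments from the card's hyperparameters.

    The output_dir name is hypothetical; the import is deferred so the
    hyperparameter dict is usable without transformers installed.
    """
    from transformers import Seq2SeqTrainingArguments  # pip install transformers

    return Seq2SeqTrainingArguments(
        output_dir=output_dir,
        predict_with_generate=True,  # assumption: typical for summarization
        **TRAINING_HPARAMS,
    )
```

The remaining card entries (num_beams, no_repeat_ngram_size, etc.) are decoding-time settings rather than training hyperparameters, so they belong in `generate()` rather than here.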
+## Results
+
+| Metric  | Value |
+| :------ | :---- |
+| ROUGE-1 | 37.70 |
+| ROUGE-2 | 11.22 |
+| ROUGE-L | 24.37 |
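The card's decoding settings (num_beams, no_repeat_ngram_size, encoder_no_repeat_ngram_size, early_stopping) plug directly into the standard `transformers` `generate()` call. A sketch of inference under those settings; the repo id `SZTAKI-HLT/mT5-base-HunSum-1` and `max_length=128` are assumptions (the card names the model but not its Hub path or output length), so verify them on the Hub before use:

```python
# Decoding settings copied from the card's "Parameters" section.
GENERATION_KWARGS = {
    "num_beams": 5,
    "no_repeat_ngram_size": 3,
    "encoder_no_repeat_ngram_size": 4,
    "early_stopping": False,
}

def summarize(article: str) -> str:
    """Generate an abstractive summary of a Hungarian article.

    Downloads the model weights on first call; the import is deferred so
    the settings dict is usable without transformers installed.
    """
    from transformers import AutoModelForSeq2SeqLM, AutoTokenizer  # pip install transformers

    model_id = "SZTAKI-HLT/mT5-base-HunSum-1"  # assumed repo id, not stated on the card
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForSeq2SeqLM.from_pretrained(model_id)
    inputs = tokenizer(article, return_tensors="pt", truncation=True)
    output_ids = model.generate(
        **inputs,
        max_length=128,  # assumption: not listed on the card
        **GENERATION_KWARGS,
    )
    return tokenizer.decode(output_ids[0], skip_special_tokens=True)
```

Note that `early_stopping=False` means beam search runs until `max_length` rather than stopping when all beams finish, matching the card's setting.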