---
license: apache-2.0
datasets:
- SZTAKI-HLT/HunSum-2-abstractive
language:
- hu
metrics:
- rouge
pipeline_tag: summarization
inference:
  parameters:
    num_beams: 5
    length_penalty: 2
    max_length: 128
    encoder_no_repeat_ngram_size: 4
    no_repeat_ngram_size: 3
base_model:
- google/mt5-base
---
|
|
|
# Model Card for mT5-base-HunSum-2
|
|
|
mT5-base-HunSum-2 is a Hungarian abstractive summarization model trained on the [SZTAKI-HLT/HunSum-2-abstractive dataset](https://huggingface.co/datasets/SZTAKI-HLT/HunSum-2-abstractive).

It was fine-tuned from [google/mt5-base](https://huggingface.co/google/mt5-base).
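A minimal usage sketch with the `transformers` library. The Hub id `SZTAKI-HLT/mT5-base-HunSum-2` is an assumption based on the dataset organization and model name; verify the exact repository name on the Hub before use.

```python
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

MODEL_ID = "SZTAKI-HLT/mT5-base-HunSum-2"  # assumed Hub id

# Beam-search settings mirroring the card's inference parameters.
GEN_KWARGS = {
    "num_beams": 5,
    "length_penalty": 2.0,
    "max_length": 128,
    "encoder_no_repeat_ngram_size": 4,
    "no_repeat_ngram_size": 3,
}


def summarize(text: str) -> str:
    """Generate an abstractive Hungarian summary for `text`."""
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForSeq2SeqLM.from_pretrained(MODEL_ID)
    inputs = tokenizer(text, return_tensors="pt", truncation=True)
    summary_ids = model.generate(**inputs, **GEN_KWARGS)
    return tokenizer.decode(summary_ids[0], skip_special_tokens=True)
```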
|
|
|
## Intended uses & limitations
|
|
|
- **Model type:** Text Summarization

- **Language(s) (NLP):** Hungarian

- **Resource(s) for more information:**

  - [GitHub Repo](https://github.com/botondbarta/HunSum)
|
|
|
## Parameters
|
|
|
- **Batch Size:** 12

- **Learning Rate:** 5e-5

- **Weight Decay:** 0.01

- **Warmup Steps:** 3000

- **Epochs:** 10

- **no_repeat_ngram_size:** 3

- **num_beams:** 5

- **early_stopping:** False

- **encoder_no_repeat_ngram_size:** 4
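Gathered as Python dicts, the settings above split into training hyperparameters (typically fed to `transformers`' `Seq2SeqTrainingArguments`) and decoding settings (passed to `model.generate`). The grouping follows the card; everything else about the training loop, including whether the batch size is per-device, is an assumption:

```python
# Training hyperparameters as listed in the card; the batch size is
# assumed to be the per-device value.
TRAINING_KWARGS = {
    "per_device_train_batch_size": 12,
    "learning_rate": 5e-5,
    "weight_decay": 0.01,
    "warmup_steps": 3000,
    "num_train_epochs": 10,
}

# Decoding settings as listed in the card, passed to `model.generate`.
GENERATION_KWARGS = {
    "no_repeat_ngram_size": 3,
    "num_beams": 5,
    "early_stopping": False,
    "encoder_no_repeat_ngram_size": 4,
}

# e.g. (hypothetical output directory):
# args = Seq2SeqTrainingArguments(output_dir="mt5-hunsum2", **TRAINING_KWARGS)
```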
|
|
|
## Results
|
|
|
| Metric  | Value |
| :------ | ----: |
| ROUGE-1 | 40.06 |
| ROUGE-2 | 12.67 |
| ROUGE-L | 25.93 |
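The scores above are ROUGE F1 values. As a reference point, ROUGE-1 is simply unigram-overlap F1 between reference and prediction; a minimal self-contained sketch (the actual evaluation presumably used a standard ROUGE package, whose tokenization and stemming details this toy version omits):

```python
from collections import Counter


def rouge1_f1(reference: str, prediction: str) -> float:
    """Unigram-overlap ROUGE-1 F1 between two whitespace-tokenized texts."""
    ref = Counter(reference.split())
    pred = Counter(prediction.split())
    overlap = sum((ref & pred).values())  # clipped unigram matches
    if overlap == 0:
        return 0.0
    precision = overlap / sum(pred.values())
    recall = overlap / sum(ref.values())
    return 2 * precision * recall / (precision + recall)


print(round(rouge1_f1("a b c d", "a b x y") * 100, 2))  # 50.0
```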