---
language:
- hu
pipeline_tag: summarization
inference:
  parameters:
    num_beams: 5
    length_penalty: 2
    max_length: 128
    encoder_no_repeat_ngram_size: 4
    no_repeat_ngram_size: 3
datasets:
- SZTAKI-HLT/HunSum-1
metrics:
- rouge
---
# Model Card for mT5-small-HunSum-1

The mT5-small-HunSum-1 is a Hungarian abstractive summarization model trained on the SZTAKI-HLT/HunSum-1 dataset. It is a fine-tuned version of google/mt5-small.
## Intended uses & limitations
- Model type: Text Summarization
- Language(s) (NLP): Hungarian
- Resource(s) for more information:
## Parameters
- Batch Size: 16
- Learning Rate: 5e-5
- Weight Decay: 0.01
- Warmup Steps: 3000
- Epochs: 10
- no_repeat_ngram_size: 3
- num_beams: 5
- early_stopping: False
- encoder_no_repeat_ngram_size: 4
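
The generation settings above match the inference parameters in the card metadata. Below is a minimal usage sketch with the Hugging Face transformers library; the repository id `SZTAKI-HLT/mT5-small-HunSum-1` and the example input text are assumptions, not taken from this card.

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

# Assumed repository id; replace with the actual model id if it differs.
model_id = "SZTAKI-HLT/mT5-small-HunSum-1"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

# Placeholder Hungarian article text to be summarized.
article = "Ide kerül az összefoglalandó magyar nyelvű cikk szövege."

inputs = tokenizer(article, return_tensors="pt", truncation=True)

# Generation parameters taken from the card's inference/parameter settings.
summary_ids = model.generate(
    **inputs,
    num_beams=5,
    length_penalty=2.0,
    max_length=128,
    no_repeat_ngram_size=3,
    encoder_no_repeat_ngram_size=4,
    early_stopping=False,
)

print(tokenizer.decode(summary_ids[0], skip_special_tokens=True))
```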
## Results
| Metric  | Value |
|---------|-------|
| ROUGE-1 | 36.49 |
| ROUGE-2 |  9.50 |
| ROUGE-L | 23.48 |