# mt5-small-multi-news
This model is a fine-tuned version of google/mt5-small on the multi_news dataset. It achieves the following results on the evaluation set:
- Loss: 3.2170
- Rouge1: 22.03
- Rouge2: 6.95
- Rougel: 18.41
- Rougelsum: 18.72
## Intended uses & limitations
This model is intended for text summarization. With further training it could achieve better results.
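As a minimal usage sketch, the checkpoint can be loaded through the `transformers` summarization pipeline; the article text below is a placeholder:

```python
from transformers import pipeline

# Load the fine-tuned checkpoint for summarization.
summarizer = pipeline("summarization", model="Ssarion/mt5-small-multi-news")

article = "Replace this with a news article to summarize."  # placeholder input
summary = summarizer(article, max_length=128, truncation=True)
print(summary[0]["summary_text"])
```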
## Training and evaluation data
Training used 10,000 samples from the multi_news train split, and evaluation used 500 samples from the multi_news validation split.
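A sketch of that subsampling with the `datasets` library follows; whether the samples were shuffled before selection is an assumption, and the code below simply takes the first N examples:

```python
from datasets import load_dataset

# Load multi_news and take fixed-size subsets, as described above.
# Taking the first N examples (rather than a shuffled sample) is an assumption.
raw = load_dataset("multi_news")
train_ds = raw["train"].select(range(10000))
eval_ds = raw["validation"].select(range(500))
```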
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (mirrored in the sketch after this list):
- learning_rate: 5.6e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 1
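As a hedged sketch, these settings map onto `Seq2SeqTrainingArguments` roughly as follows; `output_dir` and any argument not listed above are assumptions, and the Adam betas/epsilon match the library defaults, so they need no explicit flags:

```python
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="mt5-small-multi-news",  # assumption: any output path works
    learning_rate=5.6e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=1,
    # Adam betas=(0.9, 0.999) and epsilon=1e-08 are the library defaults.
)
```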
### Training results

| Training Loss | Epoch | Step | Validation Loss | Rouge1 | Rouge2 | Rougel | Rougelsum |
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|:------:|:---------:|
| 5.2732        | 1.0   | 1250 | 3.2170          | 22.03  | 6.95   | 18.41  | 18.72     |
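The ROUGE values above can be reproduced with the `evaluate` library along these lines; the prediction and reference strings are placeholders for decoded model outputs and gold summaries:

```python
import evaluate

rouge = evaluate.load("rouge")
scores = rouge.compute(
    predictions=["model generated summary"],  # placeholder decoded outputs
    references=["gold reference summary"],    # placeholder references
)
# Scores are returned in [0, 1]; scale by 100 to match the table above.
print({k: round(v * 100, 2) for k, v in scores.items()})
```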
### Framework versions
- Transformers 4.28.0
- Pytorch 2.0.0+cu117
- Datasets 2.12.0
- Tokenizers 0.13.3