# uk-mt5-base-xlsum-v2
This model is a fine-tuned version of kravchenko/uk-mt5-base on the xlsum dataset. It achieves the following results on the evaluation set:
- Loss: 2.0401
- Rouge1: 4.4311
- Rouge2: 0.8944
- Rougel: 4.4294
- Rougelsum: 4.4527
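
A minimal usage sketch, assuming the standard `transformers` seq2seq API; the generation settings and the input text below are illustrative and not taken from this card:

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_name = "lljllll2219/uk-mt5-base-xlsum-v2"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSeq2SeqLM.from_pretrained(model_name)

# Replace with a Ukrainian news article to summarize
article = "..."

inputs = tokenizer(article, return_tensors="pt", truncation=True, max_length=512)
summary_ids = model.generate(**inputs, max_length=128, num_beams=4)
print(tokenizer.decode(summary_ids[0], skip_special_tokens=True))
```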
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training (a hedged `Seq2SeqTrainingArguments` sketch follows the list):
- learning_rate: 5.6e-05
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10
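
A hedged sketch of how these hyperparameters map onto `Seq2SeqTrainingArguments`; the exact training script is not documented in this card, and the `output_dir`, evaluation strategy, and `predict_with_generate` settings are assumptions:

```python
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="uk-mt5-base-xlsum-v2",   # assumed output directory
    learning_rate=5.6e-5,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=4,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=10,
    # Adam with betas=(0.9, 0.999) and epsilon=1e-08, as listed above
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    evaluation_strategy="epoch",          # assumption: the results table reports one eval per epoch
    predict_with_generate=True,           # assumption: needed to compute ROUGE during evaluation
)
```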
### Training results
| Training Loss | Epoch | Step  | Validation Loss | Rouge1 | Rouge2 | Rougel | Rougelsum |
|:-------------:|:-----:|:-----:|:---------------:|:------:|:------:|:------:|:---------:|
| 3.2519        | 1.0   | 2000  | 2.0993          | 4.1141 | 0.5944 | 4.1014 | 4.11      |
| 2.5587        | 2.0   | 4000  | 2.0428          | 4.5015 | 0.6167 | 4.4863 | 4.518     |
| 2.3299        | 3.0   | 6000  | 2.0175          | 4.4642 | 1.0833 | 4.4528 | 4.5167    |
| 2.1543        | 4.0   | 8000  | 2.0183          | 4.3294 | 0.9444 | 4.3408 | 4.3611    |
| 2.0276        | 5.0   | 10000 | 2.0039          | 4.6694 | 0.9444 | 4.6264 | 4.6527    |
| 1.9119        | 6.0   | 12000 | 2.0139          | 4.9447 | 1.0675 | 4.8908 | 4.9633    |
| 1.8305        | 7.0   | 14000 | 2.0134          | 4.9385 | 1.1595 | 4.8774 | 4.9294    |
| 1.7669        | 8.0   | 16000 | 2.0253          | 4.2697 | 0.9667 | 4.2524 | 4.3167    |
| 1.7141        | 9.0   | 18000 | 2.0354          | 4.4527 | 0.9    | 4.448  | 4.4941    |
| 1.681         | 10.0  | 20000 | 2.0401          | 4.4311 | 0.8944 | 4.4294 | 4.4527    |
### Framework versions
- Transformers 4.34.1
- Pytorch 2.0.1+cu118
- Datasets 2.14.5
- Tokenizers 0.14.1