
# mt5-small-mlsum

This model is google/mt5-small fine-tuned for abstractive summarization on the Spanish configuration of the MLSUM dataset. It was trained using Amazon SageMaker and the Hugging Face Deep Learning Container.

## Hyperparameters

{ "dataset_config": "es", "dataset_name": "mlsum", "do_eval": true, "do_predict": true, "do_train": true, "fp16": true, "max_target_length": 64, "model_name_or_path": "google/mt5-small", "num_train_epochs": 10, "output_dir": "/opt/ml/checkpoints", "per_device_eval_batch_size": 4, "per_device_train_batch_size": 4, "predict_with_generate": true, "sagemaker_container_log_level": 20, "sagemaker_program": "run_summarization.py", "save_strategy": "epoch", "seed": 7, "summary_column": "summary", "text_column": "text" }

## Usage
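
A minimal sketch of loading this checkpoint for Spanish summarization with the `transformers` pipeline. The input text and generation settings are illustrative assumptions; `max_length` is set to match the `max_target_length` of 64 used during fine-tuning.

```python
from transformers import pipeline

# Load this repository's checkpoint as a summarization pipeline.
summarizer = pipeline("summarization", model="LeoCordoba/mt5-small-mlsum")

# Illustrative Spanish news snippet (placeholder, not taken from MLSUM):
# "The government announced a new package of economic measures on Monday
#  aimed at small and medium-sized businesses affected by the crisis."
article = (
    "El gobierno anunció este lunes un nuevo paquete de medidas económicas "
    "destinado a las pequeñas y medianas empresas afectadas por la crisis."
)

# max_length mirrors the max_target_length=64 used during training;
# num_beams is an illustrative generation setting.
summary = summarizer(article, max_length=64, num_beams=4, truncation=True)
print(summary[0]["summary_text"])
```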

## Results

| key | value |
| --- | --- |
| eval_rouge1 | 26.4352 |
| eval_rouge2 | 8.9293 |
| eval_rougeL | 21.2622 |
| eval_rougeLsum | 21.5518 |
| test_rouge1 | 26.0756 |
| test_rouge2 | 8.4669 |
| test_rougeL | 20.8167 |
| test_rougeLsum | 21.0822 |
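
For context, scores of this kind can be recomputed with the `datasets` and `evaluate` libraries along the lines sketched below. The sample size, batching, and generation settings here are assumptions and may not match the original evaluation run exactly, so the resulting numbers will not necessarily reproduce the table above.

```python
# Hedged sketch: recomputing ROUGE on a slice of the MLSUM Spanish test split.
import evaluate
from datasets import load_dataset
from transformers import pipeline

summarizer = pipeline("summarization", model="LeoCordoba/mt5-small-mlsum")
rouge = evaluate.load("rouge")

# Small sample for illustration; depending on the datasets version, loading
# mlsum may additionally require trust_remote_code=True.
test = load_dataset("mlsum", "es", split="test[:100]")

predictions = [
    out["summary_text"]
    for out in summarizer(test["text"], max_length=64, truncation=True, batch_size=8)
]

scores = rouge.compute(predictions=predictions, references=test["summary"])
print(scores)
```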