---
license: apache-2.0
tags:
- generated_from_trainer
datasets:
- cnn_dailymail
metrics:
- rouge
model-index:
- name: led-large-16384-cnn_dailymail
  results:
  - task:
      name: Sequence-to-sequence Language Modeling
      type: text2text-generation
    dataset:
      name: cnn_dailymail
      type: cnn_dailymail
      config: 3.0.0
      split: test
      args: 3.0.0
    metrics:
    - name: Rouge1
      type: rouge
      value: 0.38289524455734836
---

# led-large-16384-cnn_dailymail

This model is a fine-tuned version of [allenai/led-base-16384](https://huggingface.co/allenai/led-base-16384) on the cnn_dailymail dataset.
It achieves the following results on the evaluation set:
- Loss: 1.5981
- Rouge1: 0.3829
- Rouge2: 0.1704
- Rougel: 0.2569
- Rougelsum: 0.3614

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 2
- eval_batch_size: 2
- seed: 42
- gradient_accumulation_steps: 64
- total_train_batch_size: 128
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 6

### Training results

| Training Loss | Epoch | Step  | Validation Loss | Rouge1 | Rouge2 | Rougel | Rougelsum |
|:-------------:|:-----:|:-----:|:---------------:|:------:|:------:|:------:|:---------:|
| 1.9531        | 0.4   | 500   | 1.8639          | 0.3485 | 0.1441 | 0.2275 | 0.3288    |
| 1.9563        | 0.8   | 1000  | 1.8260          | 0.3538 | 0.1482 | 0.2315 | 0.3343    |
| 1.7176        | 1.2   | 1500  | 1.8208          | 0.3628 | 0.1527 | 0.2383 | 0.3433    |
| 1.7197        | 1.6   | 2000  | 1.8162          | 0.3696 | 0.1602 | 0.2434 | 0.3486    |
| 1.8086        | 2.0   | 2500  | 1.7924          | 0.3558 | 0.1533 | 0.2334 | 0.3361    |
| 1.2448        | 2.4   | 3000  | 1.8510          | 0.3703 | 0.1591 | 0.2447 | 0.3483    |
| 1.3574        | 2.8   | 3500  | 1.8277          | 0.3741 | 0.1593 | 0.2422 | 0.3540    |
| 1.0966        | 3.2   | 4000  | 1.8924          | 0.3682 | 0.1576 | 0.2424 | 0.3479    |
| 0.9938        | 3.6   | 4500  | 1.8957          | 0.3723 | 0.1599 | 0.2451 | 0.3511    |
| 1.0735        | 4.0   | 5000  | 1.8772          | 0.3653 | 0.1557 | 0.2399 | 0.3454    |
| 0.9106        | 4.4   | 5500  | 1.9401          | 0.3720 | 0.1585 | 0.2436 | 0.3504    |
| 1.015         | 4.8   | 6000  | 1.9320          | 0.3725 | 0.1570 | 0.2429 | 0.3515    |
| 1.7854        | 0.36  | 6500  | 1.7800          | 0.3624 | 0.1544 | 0.2390 | 0.3422    |
| 1.9079        | 0.39  | 7000  | 1.7629          | 0.3573 | 0.1553 | 0.2352 | 0.3370    |
| 1.7606        | 3.34  | 7500  | 1.6902          | 0.3783 | 0.1673 | 0.2521 | 0.3570    |
| 1.7571        | 3.57  | 8000  | 1.6563          | 0.3802 | 0.1691 | 0.2538 | 0.3587    |
| 1.6602        | 3.79  | 8500  | 1.6439          | 0.3814 | 0.1693 | 0.2548 | 0.3600    |
| 1.6614        | 4.01  | 9000  | 1.6312          | 0.3812 | 0.1691 | 0.2544 | 0.3599    |
| 1.668         | 4.24  | 9500  | 1.6189          | 0.3815 | 0.1689 | 0.2550 | 0.3603    |
| 1.6491        | 4.46  | 10000 | 1.6172          | 0.3799 | 0.1681 | 0.2540 | 0.3586    |
| 1.5994        | 4.68  | 10500 | 1.6132          | 0.3825 | 0.1702 | 0.2560 | 0.3610    |
| 1.6493        | 4.9   | 11000 | 1.6093          | 0.3828 | 0.1701 | 0.2561 | 0.3613    |
| 1.6769        | 5.13  | 11500 | 1.6074          | 0.3831 | 0.1706 | 0.2569 | 0.3619    |
| 1.6554        | 5.35  | 12000 | 1.6044          | 0.3817 | 0.1695 | 0.2559 | 0.3605    |
| 1.6155        | 5.57  | 12500 | 1.6010          | 0.3825 | 0.1700 | 0.2561 | 0.3608    |
| 1.5863        | 5.8   | 13000 | 1.5981          | 0.3829 | 0.1704 | 0.2569 | 0.3614    |

### Framework versions

- Transformers 4.30.2
- Pytorch 1.13.1
- Datasets 2.13.0
- Tokenizers 0.13.3
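
## How to use

This card ships without an inference snippet; the following is a minimal sketch, assuming the checkpoint is published on the Hub (the `model_id` below is a placeholder, not the confirmed repo path). Per the LED documentation, long-document summarization typically works best with global attention enabled on the first token.

```python
import torch
from transformers import AutoTokenizer, LEDForConditionalGeneration

# Placeholder repo id: substitute the actual Hub path of this checkpoint.
model_id = "your-namespace/led-large-16384-cnn_dailymail"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = LEDForConditionalGeneration.from_pretrained(model_id)

article = "(CNN) -- Your long news article goes here ..."
inputs = tokenizer(article, return_tensors="pt", truncation=True, max_length=16384)

# LED uses windowed local attention; the usual convention is to give the
# first token global attention so every position can attend to one anchor.
global_attention_mask = torch.zeros_like(inputs["input_ids"])
global_attention_mask[:, 0] = 1

summary_ids = model.generate(
    inputs["input_ids"],
    attention_mask=inputs["attention_mask"],
    global_attention_mask=global_attention_mask,
    num_beams=4,
    max_length=142,  # assumption: a CNN/DailyMail-style summary length budget
)
print(tokenizer.decode(summary_ids[0], skip_special_tokens=True))
```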
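
## Reproducing the training configuration

The hyperparameters listed above map onto `Seq2SeqTrainingArguments` roughly as shown below. This is a sketch, not the author's actual script: the output directory, evaluation cadence, and `predict_with_generate` flag are assumptions inferred from the results table, which logs a ROUGE evaluation every 500 steps.

```python
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="led-large-16384-cnn_dailymail",  # assumption
    learning_rate=5e-5,
    per_device_train_batch_size=2,
    per_device_eval_batch_size=2,
    gradient_accumulation_steps=64,  # effective batch size: 2 * 64 = 128
    num_train_epochs=6,
    lr_scheduler_type="linear",
    seed=42,
    evaluation_strategy="steps",     # assumption, based on eval every 500 steps
    eval_steps=500,
    predict_with_generate=True,      # needed to compute ROUGE during evaluation
)
# Adam with betas=(0.9, 0.999) and epsilon=1e-08 is the Transformers default
# optimizer configuration, so it needs no explicit arguments here.
```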
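
## A note on the ROUGE scale

The ROUGE values in this card are fractions in [0, 1] rather than the 0-100 scale common in papers, so Rouge1 = 0.3829 corresponds to 38.29. Below is a minimal sketch of how such scores are typically produced with the `evaluate` library; the exact metric configuration behind this card is not documented.

```python
import evaluate

rouge = evaluate.load("rouge")

predictions = ["the cat sat on the mat"]         # generated summaries
references = ["the cat was sitting on the mat"]  # reference summaries

scores = rouge.compute(predictions=predictions, references=references)
# Returns rouge1 / rouge2 / rougeL / rougeLsum as floats in [0, 1],
# the same scale as the numbers reported above.
print(scores)
```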