---
license: apache-2.0
base_model: facebook/bart-large
tags:
- generated_from_trainer
metrics:
- rouge
- wer
model-index:
- name: bart_extractive_512_500
  results: []
---

# bart_extractive_512_500

This model is a fine-tuned version of [facebook/bart-large](https://huggingface.co/facebook/bart-large) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.9749
- Rouge1: 0.7000
- Rouge2: 0.4441
- Rougel: 0.6408
- Rougelsum: 0.6409
- Wer: 0.4458

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 6
- eval_batch_size: 6
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 2
- mixed_precision_training: Native AMP

### Training results

| Training Loss | Epoch | Step | Validation Loss | Rouge1 | Rouge2 | Rougel | Rougelsum | Wer    |
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|:------:|:---------:|:------:|
| No log        | 0.13  | 250  | 1.2262          | 0.6523 | 0.3774 | 0.5876 | 0.5877    | 0.5064 |
| 2.0992        | 0.27  | 500  | 1.1233          | 0.6736 | 0.4029 | 0.6091 | 0.6091    | 0.4868 |
| 2.0992        | 0.40  | 750  | 1.1033          | 0.6826 | 0.4152 | 0.6187 | 0.6188    | 0.4768 |
| 1.1914        | 0.53  | 1000 | 1.0645          | 0.6812 | 0.4159 | 0.6178 | 0.6180    | 0.4713 |
| 1.1914        | 0.66  | 1250 | 1.0493          | 0.6845 | 0.4206 | 0.6217 | 0.6219    | 0.4673 |
| 1.1319        | 0.80  | 1500 | 1.0348          | 0.6906 | 0.4270 | 0.6292 | 0.6292    | 0.4649 |
| 1.1319        | 0.93  | 1750 | 1.0227          | 0.6893 | 0.4289 | 0.6286 | 0.6287    | 0.4596 |
| 1.0853        | 1.06  | 2000 | 1.0093          | 0.6898 | 0.4297 | 0.6298 | 0.6298    | 0.4575 |
| 1.0853        | 1.20  | 2250 | 1.0045          | 0.6981 | 0.4381 | 0.6376 | 0.6377    | 0.4547 |
| 0.9975        | 1.33  | 2500 | 0.9967          | 0.6964 | 0.4394 | 0.6368 | 0.6369    | 0.4511 |
| 0.9975        | 1.46  | 2750 | 0.9863          | 0.6995 | 0.4419 | 0.6401 | 0.6403    | 0.4495 |
| 0.9970        | 1.60  | 3000 | 0.9844          | 0.7016 | 0.4441 | 0.6420 | 0.6421    | 0.4483 |
| 0.9970        | 1.73  | 3250 | 0.9819          | 0.6982 | 0.4431 | 0.6399 | 0.6400    | 0.4476 |
| 0.9651        | 1.86  | 3500 | 0.9746          | 0.6994 | 0.4441 | 0.6404 | 0.6406    | 0.4456 |
| 0.9651        | 1.99  | 3750 | 0.9749          | 0.7000 | 0.4441 | 0.6408 | 0.6409    | 0.4458 |

### Framework versions

- Transformers 4.38.2
- Pytorch 2.2.1+cu121
- Datasets 2.18.0
- Tokenizers 0.15.2
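
## How to use

The card does not include a usage snippet, so the following is a minimal sketch of running the checkpoint for summarization with the `transformers` pipeline. The model id `bart_extractive_512_500` is taken from this card's name; substitute the actual Hub repo id or local output directory, and note the generation lengths are illustrative choices, not values from the card.

```python
from transformers import pipeline

# Assumption: the checkpoint is available under this id
# (taken from the card name; adjust to the actual location).
summarizer = pipeline("summarization", model="bart_extractive_512_500")

article = (
    "The tower is 324 metres tall, about the same height as an 81-storey "
    "building, and was the tallest man-made structure in the world for 41 years."
)

# max_length/min_length bound the generated summary; truncation=True clips
# over-long inputs to the model's maximum input length.
result = summarizer(article, max_length=64, min_length=8, truncation=True)
print(result[0]["summary_text"])
```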
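## Reproducing the training setup

The hyperparameters listed under "Training procedure" map onto `Seq2SeqTrainingArguments` roughly as sketched below. This is a reconstruction, not the original training script: `output_dir` is a placeholder, the 250-step evaluation cadence is inferred from the results table, and `logging_steps=500` is inferred from the "No log" entry in its first row.

```python
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="bart_extractive_512_500",  # placeholder
    learning_rate=2e-5,
    per_device_train_batch_size=6,
    per_device_eval_batch_size=6,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=2,
    fp16=True,                    # "Native AMP" mixed precision
    evaluation_strategy="steps",  # inferred: eval every 250 steps in the table
    eval_steps=250,
    logging_steps=500,            # inferred from the "No log" first row
    predict_with_generate=True,   # needed to score generated text with ROUGE/WER
)
# The listed Adam betas=(0.9, 0.999) and epsilon=1e-08 are the optimizer
# defaults, so no explicit arguments are needed for them.
```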
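## Computing the reported metrics

The ROUGE and WER numbers reported above can be computed with the `evaluate` library. The snippet below is a minimal sketch with placeholder strings, not data from this model's evaluation set.

```python
import evaluate

rouge = evaluate.load("rouge")
wer = evaluate.load("wer")

predictions = ["the fox jumped over the dog"]
references = ["the quick brown fox jumped over the lazy dog"]

# rouge returns rouge1/rouge2/rougeL/rougeLsum scores in [0, 1],
# the same scale as the Rouge columns in the results table.
print(rouge.compute(predictions=predictions, references=references))

# wer returns a single word error rate (lower is better).
print(wer.compute(predictions=predictions, references=references))
```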