Quantization made by Richard Erkhov.
# bart_samsum - bnb 4bits
- Model creator: https://huggingface.co/Arjun9/
- Original model: https://huggingface.co/Arjun9/bart_samsum/
Original model description:
```yaml
license: mit
base_model: facebook/bart-large-xsum
tags:
- generated_from_trainer
metrics:
- rouge
- bleu
model-index:
- name: bart_samsum
  results: []
datasets:
- samsum
pipeline_tag: summarization
```
# bart_samsum
This model is a fine-tuned version of facebook/bart-large-xsum on the samsum dataset. It achieves the following results on the evaluation set:
- Loss: 1.4947
- Rouge1: 53.3294
- Rouge2: 28.6009
- Rougel: 44.2008
- Rougelsum: 49.2031
- Bleu: 0.0
- Meteor: 0.4887
- Gen Len: 30.1209
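The ROUGE scores above are n-gram overlap F-measures between generated and reference summaries (Rouge1 for unigrams, Rouge2 for bigrams). A minimal sketch of the ROUGE-1 F1 idea, simplified relative to the official `rouge_score` package (whitespace tokenization only, no stemming):

```python
from collections import Counter

def rouge1_f1(candidate: str, reference: str) -> float:
    # Simplified ROUGE-1: F1 over clipped unigram matches between the
    # candidate summary and the reference summary.
    cand = Counter(candidate.lower().split())
    ref = Counter(reference.lower().split())
    overlap = sum((cand & ref).values())  # unigram matches, clipped per word
    if overlap == 0:
        return 0.0
    precision = overlap / sum(cand.values())
    recall = overlap / sum(ref.values())
    return 2 * precision * recall / (precision + recall)

print(rouge1_f1("amanda baked cookies for jerry",
                "amanda baked cookies and will bring jerry some"))
```

A Rouge1 of 53.33 therefore means the generated summaries share roughly half their unigrams with the references, averaged over the evaluation set (the official scorer reports scores scaled to 0–100).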
## Framework versions
- Transformers 4.40.0
- Pytorch 2.2.1+cu121
- Datasets 2.19.0
- Tokenizers 0.19.1