# bart-base-samsum

This model was obtained by fine-tuning facebook/bart-base on the SAMSum dataset.

## Usage

```python
from transformers import pipeline

summarizer = pipeline("summarization", model="lidiya/bart-base-samsum")
conversation = '''Jeff: Can I train a 🤗 Transformers model on Amazon SageMaker?
Philipp: Sure you can use the new Hugging Face Deep Learning Container.
Jeff: ok.
Jeff: and how can I get started?
Jeff: where can I find documentation?
Philipp: ok, ok you can find everything here. https://huggingface.co/blog/the-partnership-amazon-sagemaker-and-hugging-face
'''
summarizer(conversation)
```

## Training procedure

### Results

| key            | value   |
| -------------- | ------- |
| eval_rouge1    | 46.6619 |
| eval_rouge2    | 23.3285 |
| eval_rougeL    | 39.4811 |
| eval_rougeLsum | 43.0482 |
| test_rouge1    | 44.9932 |
| test_rouge2    | 21.7286 |
| test_rougeL    | 38.1921 |
| test_rougeLsum | 41.2672 |
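The scores above are ROUGE values (scaled to 0–100) on the SAMSum validation (`eval_*`) and test (`test_*`) splits. As a rough illustration of what ROUGE-1 measures, here is a minimal unigram-overlap sketch; it is a toy, not the actual scoring code — the reported numbers come from a full ROUGE implementation with stemming and proper tokenization.

```python
from collections import Counter

def rouge1_f1(reference: str, candidate: str) -> float:
    """Toy ROUGE-1 F1: unigram overlap between a reference and a candidate summary."""
    ref_counts = Counter(reference.lower().split())
    cand_counts = Counter(candidate.lower().split())
    # Clipped overlap: each candidate unigram counts at most as often
    # as it appears in the reference.
    overlap = sum(min(cand_counts[w], ref_counts[w]) for w in cand_counts)
    if overlap == 0:
        return 0.0
    precision = overlap / sum(cand_counts.values())
    recall = overlap / sum(ref_counts.values())
    return 2 * precision * recall / (precision + recall)

reference = "jeff asks how to train a model on sagemaker"
candidate = "jeff asks how to train on sagemaker"
print(round(rouge1_f1(reference, candidate), 3))  # → 0.875
```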