# facebook/bart-base model fine-tuned on CNN/DailyMail

This model was created using the nn_pruning Python library: the linear layers contain 35% of the original weights.

The model contains 53% of the original weights overall (the embeddings account for a significant part of the model, and they are not pruned by this method).
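Density figures like the ones above can be checked programmatically. A minimal sketch, assuming PyTorch: it counts nonzero weights across all `nn.Linear` modules. The toy model and the zeroed-out rows are purely illustrative, not this checkpoint:

```python
import torch
import torch.nn as nn

def linear_density(model: nn.Module) -> float:
    """Fraction of nonzero weights across all nn.Linear layers."""
    total = 0
    nonzero = 0
    for module in model.modules():
        if isinstance(module, nn.Linear):
            total += module.weight.numel()
            nonzero += int(module.weight.count_nonzero())
    return nonzero / total

# Toy model for illustration: zero out half of the first layer's rows,
# mimicking what block pruning does to a linear weight matrix.
toy = nn.Sequential(nn.Linear(4, 4), nn.ReLU(), nn.Linear(4, 2))
with torch.no_grad():
    toy[0].weight[:2].zero_()

print(f"linear density: {linear_density(toy):.2f}")  # → linear density: 0.67
```

Running the same function over the pruned checkpoint's linear layers should recover a density close to the 35% quoted above.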

## Fine-Pruning details

This model was fine-tuned from the HuggingFace facebook/bart-base checkpoint.

A side effect of the block pruning is that some of the attention heads are completely removed: 38 heads were removed out of a total of 216 (17.6%).
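The head count works out as follows. The sketch below assumes the standard bart-base configuration (6 encoder layers and 6 decoder layers with 12 heads each, where each decoder layer carries both a self-attention and a cross-attention module); these numbers are an assumption, not stated in this card:

```python
# Assumed bart-base configuration (not taken from this card):
layers = 6   # per stack (encoder and decoder)
heads = 12   # attention heads per attention module

encoder_self = layers * heads   # 72 encoder self-attention heads
decoder_self = layers * heads   # 72 decoder self-attention heads
decoder_cross = layers * heads  # 72 decoder cross-attention heads
total = encoder_self + decoder_self + decoder_cross

removed = 38
print(total)                            # → 216
print(f"{100 * removed / total:.1f}%")  # → 17.6%
```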

## Details of the CNN/DailyMail dataset

| Dataset       | Split | # samples |
| ------------- | ----- | --------- |
| CNN/DailyMail | train | 287K      |
| CNN/DailyMail | eval  | 13K       |

## Results

| Metric  | Value |
| ------- | ----- |
| Rouge 1 | 42.18 |
| Rouge 2 | 19.44 |
| Rouge L | 39.17 |
