---
language: en
license: apache-2.0
tags:
- summarization
datasets:
- cnn_dailymail
metrics:
- R1
- R2
- RL
model-index:
- name: echarlaix/bart-base-cnn-r2-18.7-d23-hybrid
  results:
  - task:
      type: summarization
      name: Summarization
    dataset:
      name: cnn_dailymail
      type: cnn_dailymail
      config: 3.0.0
      split: test
    metrics:
    - name: ROUGE-1
      type: rouge
      value: 23.7908
      verified: true
    - name: ROUGE-2
      type: rouge
      value: 11.3439
      verified: true
    - name: ROUGE-L
      type: rouge
      value: 19.7608
      verified: true
    - name: ROUGE-LSUM
      type: rouge
      value: 22.3485
      verified: true
    - name: loss
      type: loss
      value: 2.0443272590637207
      verified: true
    - name: gen_len
      type: gen_len
      value: 19.9996
      verified: true
---

## facebook/bart-base model fine-tuned on CNN/DailyMail

This model was created using the [nn_pruning](https://github.com/huggingface/nn_pruning) Python library: the linear layers contain **23%** of the original weights.



The model retains **45%** of the original weights **overall** (the embeddings account for a significant part of the model, and they are not pruned by this method).

<div class="graph"><script src="/echarlaix/bart-base-cnn-r2-18.7-d23-hybrid/raw/main/model_card/density_info.js" id="4348cd46-05bd-4e27-b565-6693f9e0b03e"></script></div>
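As a rough consistency check of the two densities above (assuming the embeddings are kept fully dense and everything else is pruned to the linear-layer density), the implied share of parameters sitting in the embeddings can be back-solved:

```python
# Sanity check on the stated densities, assuming a two-part parameter budget:
# embeddings (kept fully dense) and linear layers (pruned to 23% density).
linear_density = 0.23   # density of the pruned linear layers
overall_density = 0.45  # density of the whole model

# overall = e * 1.0 + (1 - e) * linear_density, solved for the embedding share e
embedding_share = (overall_density - linear_density) / (1.0 - linear_density)
print(f"implied embedding share: {embedding_share:.1%}")  # → 28.6%
```

An embedding share of roughly 29% is consistent with the note that the embeddings account for a significant part of the model.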


## Fine-Pruning details
This model was fine-tuned from the HuggingFace [model](https://huggingface.co/facebook/bart-base).
A side-effect of block pruning is that some attention heads are completely removed: 61 heads out of a total of 216 (28.2%) were pruned.
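The head count of 216 follows from the bart-base architecture: 6 encoder layers (self-attention) plus 6 decoder layers (self-attention and cross-attention), with 12 heads per attention module:

```python
# bart-base: 6 encoder layers with self-attention, 6 decoder layers with
# both self-attention and cross-attention, 12 heads per attention module.
heads_per_module = 12
attention_modules = 6 + 6 * 2          # encoder self + decoder self/cross
total_heads = attention_modules * heads_per_module
removed = 61
print(total_heads, f"{removed / total_heads:.1%}")  # → 216 28.2%
```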

## Details of the CNN/DailyMail dataset

|    Dataset    | Split | # samples |
| ------------- | ----- | --------- |
| CNN/DailyMail | train |   287K    |
| CNN/DailyMail | eval  |    13K    |

### Results


|    Metric   |   Value   |
| ----------- | --------- |
| **Rouge 1** | **41.43** |
| **Rouge 2** | **18.72** |
| **Rouge L** | **38.35** |
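
### How to use

Like other BART summarization checkpoints, this model can be loaded with the standard `transformers` summarization pipeline. A minimal sketch (the article text below is a made-up example; running it downloads the checkpoint):

```python
from transformers import pipeline

# Load the pruned checkpoint with the generic summarization pipeline.
summarizer = pipeline(
    "summarization", model="echarlaix/bart-base-cnn-r2-18.7-d23-hybrid"
)

article = (
    "The tower is 324 metres tall, about the same height as an 81-storey "
    "building, and is the tallest structure in Paris."
)
print(summarizer(article, max_length=60, min_length=10)[0]["summary_text"])
```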