---
license: mit
tags:
- generated_from_trainer
metrics:
- rouge
model-index:
- name: bart-large-cnn-finetuned-roundup-32
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# bart-large-cnn-finetuned-roundup-32

This model is a fine-tuned version of [facebook/bart-large-cnn](https://huggingface.co/facebook/bart-large-cnn) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 2.2324
- Rouge1: 46.462
- Rouge2: 25.9506
- Rougel: 29.4584
- Rougelsum: 44.1863
- Gen Len: 142.0
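
A minimal inference sketch using the `transformers` summarization pipeline is shown below. The Hub id is a placeholder (substitute the namespace actually hosting this checkpoint); `max_length=142`/`min_length=56` mirror the base model's generation defaults and the average generation length reported above.

```python
from transformers import pipeline

# Placeholder Hub id: substitute the namespace that hosts this checkpoint.
summarizer = pipeline(
    "summarization",
    model="<namespace>/bart-large-cnn-finetuned-roundup-32",
)

article = "..."  # long input text to summarize

# 142/56 are the generation bounds inherited from facebook/bart-large-cnn;
# the reported Gen Len of 142.0 suggests summaries routinely hit that cap.
summary = summarizer(article, max_length=142, min_length=56, do_sample=False)
print(summary[0]["summary_text"])
```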

## Model description

This checkpoint continues training [facebook/bart-large-cnn](https://huggingface.co/facebook/bart-large-cnn), a BART large model fine-tuned for abstractive summarization on CNN/DailyMail, for a further 32 epochs. Judging by the model name, the target task is summarizing news roundups, but the fine-tuning dataset itself is not documented.

## Intended uses & limitations

Like its base model, this checkpoint is intended for abstractive summarization of English text; generated summaries average about 142 tokens (the Gen Len reported above). Because the fine-tuning data is undocumented, its domain coverage and biases are unknown, and outputs should be reviewed before downstream use.

## Training and evaluation data

The fine-tuning dataset is not documented. From the training log below (132 optimizer steps per epoch at a batch size of 2, assuming no gradient accumulation), the training split contained roughly 264 examples.

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a `Seq2SeqTrainingArguments` sketch follows the list):
- learning_rate: 2e-05
- train_batch_size: 2
- eval_batch_size: 2
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 32
- mixed_precision_training: Native AMP
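
These settings map onto the stock `Seq2SeqTrainingArguments` roughly as follows; this is a hedged sketch for Transformers 4.18, with the model, tokenizer, and dataset wiring omitted. The Adam betas and epsilon listed above are the library defaults, so they need not be set explicitly.

```python
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="bart-large-cnn-finetuned-roundup-32",  # illustrative
    learning_rate=2e-5,
    per_device_train_batch_size=2,
    per_device_eval_batch_size=2,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=32,
    fp16=True,                    # "Native AMP" mixed precision
    evaluation_strategy="epoch",  # inferred from the per-epoch table below
    predict_with_generate=True,   # required to compute ROUGE during evaluation
)
```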

### Training results

| Training Loss | Epoch | Step | Validation Loss | Rouge1  | Rouge2  | Rougel  | Rougelsum | Gen Len |
|:-------------:|:-----:|:----:|:---------------:|:-------:|:-------:|:-------:|:---------:|:-------:|
| No log        | 1.0   | 132  | 1.3139          | 48.8247 | 29.2173 | 31.7628 | 45.8992   | 142.0   |
| No log        | 2.0   | 264  | 1.2287          | 47.9398 | 29.4061 | 30.9133 | 44.9142   | 140.9   |
| No log        | 3.0   | 396  | 1.2676          | 49.2743 | 30.4469 | 32.8893 | 46.6208   | 142.0   |
| 0.9578        | 4.0   | 528  | 1.3218          | 47.315  | 26.7303 | 30.5007 | 44.7654   | 142.0   |
| 0.9578        | 5.0   | 660  | 1.3173          | 47.1476 | 25.9408 | 29.4257 | 44.4956   | 142.0   |
| 0.9578        | 6.0   | 792  | 1.4283          | 47.5836 | 27.1572 | 29.8553 | 44.8858   | 142.0   |
| 0.9578        | 7.0   | 924  | 1.5005          | 46.6839 | 26.2214 | 30.1895 | 43.8753   | 140.75  |
| 0.3306        | 8.0   | 1056 | 1.5316          | 47.7611 | 27.1105 | 30.8142 | 44.7598   | 142.0   |
| 0.3306        | 9.0   | 1188 | 1.6295          | 48.4416 | 27.6912 | 30.3409 | 45.317    | 142.0   |
| 0.3306        | 10.0  | 1320 | 1.6564          | 46.5751 | 27.2306 | 29.7265 | 43.7327   | 142.0   |
| 0.3306        | 11.0  | 1452 | 1.7471          | 47.9684 | 27.5739 | 30.7018 | 44.6852   | 141.75  |
| 0.145         | 12.0  | 1584 | 1.7700          | 47.9274 | 28.5129 | 31.129  | 45.1009   | 142.0   |
| 0.145         | 13.0  | 1716 | 1.8391          | 49.8091 | 30.1597 | 33.6004 | 47.2007   | 141.95  |
| 0.145         | 14.0  | 1848 | 1.9212          | 45.2195 | 25.033  | 27.4181 | 42.6161   | 142.0   |
| 0.145         | 15.0  | 1980 | 1.9267          | 48.4959 | 28.1    | 31.2796 | 46.2758   | 142.0   |
| 0.0723        | 16.0  | 2112 | 1.9130          | 47.0765 | 27.4929 | 30.6862 | 44.1458   | 142.0   |
| 0.0723        | 17.0  | 2244 | 1.9514          | 48.5354 | 28.4909 | 31.8966 | 45.7116   | 142.0   |
| 0.0723        | 18.0  | 2376 | 2.0064          | 47.9339 | 28.6862 | 32.4472 | 45.3704   | 142.0   |
| 0.042         | 19.0  | 2508 | 2.0210          | 48.3169 | 28.1579 | 30.2681 | 45.3831   | 141.3   |
| 0.042         | 20.0  | 2640 | 2.0377          | 46.8156 | 26.0122 | 28.817  | 43.9383   | 142.0   |
| 0.042         | 21.0  | 2772 | 2.0587          | 46.3813 | 27.3555 | 29.875  | 43.6605   | 142.0   |
| 0.042         | 22.0  | 2904 | 2.0695          | 45.6728 | 26.0639 | 29.5653 | 42.3772   | 142.0   |
| 0.025         | 23.0  | 3036 | 2.1617          | 46.7283 | 26.2082 | 28.52   | 43.3304   | 142.0   |
| 0.025         | 24.0  | 3168 | 2.1375          | 48.1347 | 28.3444 | 31.7509 | 45.4907   | 142.0   |
| 0.025         | 25.0  | 3300 | 2.1911          | 47.3358 | 27.1479 | 29.4923 | 44.0087   | 142.0   |
| 0.025         | 26.0  | 3432 | 2.1806          | 47.2218 | 26.8421 | 30.03   | 44.2417   | 142.0   |
| 0.0153        | 27.0  | 3564 | 2.1890          | 46.3745 | 27.0095 | 29.7274 | 43.3372   | 142.0   |
| 0.0153        | 28.0  | 3696 | 2.2235          | 50.1274 | 30.8817 | 32.8766 | 46.7486   | 141.5   |
| 0.0153        | 29.0  | 3828 | 2.2236          | 50.1785 | 30.8079 | 32.8886 | 46.9888   | 142.0   |
| 0.0153        | 30.0  | 3960 | 2.2312          | 46.7468 | 26.4272 | 30.1175 | 43.9132   | 142.0   |
| 0.0096        | 31.0  | 4092 | 2.2287          | 47.558  | 26.3933 | 29.9122 | 44.5752   | 142.0   |
| 0.0096        | 32.0  | 4224 | 2.2324          | 46.462  | 25.9506 | 29.4584 | 44.1863   | 142.0   |
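
Validation loss bottoms out at epoch 2 (1.2287) and climbs steadily afterwards while the training loss keeps shrinking, a typical overfitting pattern, so an earlier checkpoint may generalize better than the final one summarized at the top of this card.

The ROUGE columns are consistent with the convention of the stock Transformers summarization script: the mid f-measure scaled to 0-100. Below is a sketch of that computation with `datasets.load_metric` (the metric API shipped with Datasets 2.1.0), using hypothetical predictions and references:

```python
from datasets import load_metric

rouge = load_metric("rouge")

preds = ["the cat sat on the mat"]       # hypothetical model outputs
refs = ["a cat was sitting on the mat"]  # hypothetical references

result = rouge.compute(predictions=preds, references=refs, use_stemmer=True)

# Each value is an AggregateScore; the card-style number is mid f-measure * 100.
scores = {k: round(v.mid.fmeasure * 100, 4) for k, v in result.items()}
print(scores)  # keys: rouge1, rouge2, rougeL, rougeLsum
```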


### Framework versions

- Transformers 4.18.0
- Pytorch 1.11.0+cu113
- Datasets 2.1.0
- Tokenizers 0.12.1