---
license: mit
base_model: facebook/bart-large-cnn
tags:
- generated_from_trainer
metrics:
- rouge
model-index:
- name: bart-large-cnn-finetuned-small-context-news-1000
  results: []
pipeline_tag: summarization
---

# bart-large-cnn-finetuned-small-context-news-1000

This model is a fine-tuned version of [facebook/bart-large-cnn](https://huggingface.co/facebook/bart-large-cnn) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.9930
- Rouge1: 65.1207
- Rouge2: 55.5654
- RougeL: 60.1703
- RougeLsum: 61.6717
- Gen Len: 66.6529
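The ROUGE values above are on a 0–100 scale. As a minimal sketch (the actual evaluation script was not provided), scores on this scale can be computed with the `evaluate` library; the predictions and references below are placeholders:

```python
# Sketch: ROUGE scoring with the `evaluate` library.
# The strings below are placeholders, not outputs of this model.
import evaluate

rouge = evaluate.load("rouge")
scores = rouge.compute(
    predictions=["a generated summary"],
    references=["the reference summary"],
    use_stemmer=True,
)
# `evaluate` reports ROUGE in [0, 1]; scale by 100 to match the figures reported here.
print({k: round(v * 100, 4) for k, v in scores.items()})
```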

## Model description

More information needed

## Intended uses & limitations

More information needed
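Pending fuller documentation, the model can be used for abstractive summarization via the standard Transformers pipeline. A minimal sketch, assuming the hub id below is completed with the owner's namespace and that the input is a short news article:

```python
from transformers import pipeline

# Hypothetical hub id: prepend the actual owner namespace before use.
summarizer = pipeline(
    "summarization",
    model="bart-large-cnn-finetuned-small-context-news-1000",
)

article = "..."  # placeholder: the news article to summarize
result = summarizer(article, max_length=128, min_length=30, do_sample=False)
print(result[0]["summary_text"])
```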

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 20
- mixed_precision_training: Native AMP
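As a non-authoritative sketch, these settings map onto `Seq2SeqTrainingArguments` (Transformers 4.38 API) roughly as follows; the output directory is a placeholder, and the evaluation strategy and `predict_with_generate` flags are assumptions inferred from the per-epoch ROUGE results below:

```python
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="bart-large-cnn-finetuned-small-context-news-1000",  # placeholder
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=20,
    fp16=True,                    # "Native AMP" mixed-precision training
    evaluation_strategy="epoch",  # assumption: the table below reports per-epoch eval
    predict_with_generate=True,   # assumption: required for ROUGE / Gen Len metrics
)
```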

### Training results

| Training Loss | Epoch | Step | Validation Loss | Rouge1  | Rouge2  | RougeL  | RougeLsum | Gen Len |
|:-------------:|:-----:|:----:|:---------------:|:-------:|:-------:|:-------:|:---------:|:-------:|
| No log        | 1.0   | 85   | 0.4915          | 61.0185 | 47.1863 | 53.5499 | 55.4476   | 66.2824 |
| No log        | 2.0   | 170  | 0.5558          | 63.1675 | 51.7011 | 57.0742 | 58.1801   | 67.2235 |
| No log        | 3.0   | 255  | 0.5447          | 64.6201 | 54.8904 | 59.8669 | 60.7456   | 67.4529 |
| No log        | 4.0   | 340  | 0.5770          | 65.2542 | 54.5710 | 59.8900 | 61.0988   | 65.0941 |
| No log        | 5.0   | 425  | 0.6406          | 64.8868 | 54.2641 | 59.2758 | 60.4861   | 67.4118 |
| 0.2062        | 6.0   | 510  | 0.6468          | 65.1216 | 54.5784 | 59.3594 | 60.3826   | 66.7529 |
| 0.2062        | 7.0   | 595  | 0.6828          | 64.1620 | 54.1786 | 59.1392 | 60.2517   | 67.4412 |
| 0.2062        | 8.0   | 680  | 0.7481          | 64.6093 | 54.4423 | 59.9194 | 61.1767   | 66.2647 |
| 0.2062        | 9.0   | 765  | 0.7916          | 65.0347 | 55.2975 | 60.3007 | 61.4619   | 67.8471 |
| 0.2062        | 10.0  | 850  | 0.7699          | 65.672  | 55.5276 | 60.3711 | 61.5138   | 66.9529 |
| 0.2062        | 11.0  | 935  | 0.7712          | 65.7327 | 55.9363 | 61.0215 | 62.1639   | 65.7294 |
| 0.0273        | 12.0  | 1020 | 0.9920          | 65.2328 | 55.3817 | 60.0671 | 61.4812   | 66.3588 |
| 0.0273        | 13.0  | 1105 | 0.8023          | 65.2372 | 55.2458 | 60.2251 | 61.5193   | 65.4824 |
| 0.0273        | 14.0  | 1190 | 0.8660          | 65.0369 | 55.2548 | 59.8089 | 61.3785   | 68.0353 |
| 0.0273        | 15.0  | 1275 | 0.9539          | 65.4251 | 55.1068 | 60.2355 | 61.6598   | 66.7765 |
| 0.0273        | 16.0  | 1360 | 0.8840          | 65.5440 | 55.9510 | 59.9112 | 61.6029   | 66.7529 |
| 0.0273        | 17.0  | 1445 | 0.9141          | 65.7685 | 55.4981 | 60.5750 | 62.2381   | 66.4882 |
| 0.0090        | 18.0  | 1530 | 1.0024          | 65.4152 | 55.7546 | 60.5256 | 62.0985   | 67.2412 |
| 0.0090        | 19.0  | 1615 | 0.9997          | 65.0153 | 55.1772 | 60.1030 | 61.4286   | 66.3529 |
| 0.0090        | 20.0  | 1700 | 0.9930          | 65.1207 | 55.5654 | 60.1703 | 61.6717   | 66.6529 |
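Note that validation loss is lowest after epoch 1 (0.4915) and trends upward thereafter, while ROUGE scores improve only modestly; the summary figures at the top of this card are taken from the final (epoch 20) checkpoint.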


### Framework versions

- Transformers 4.38.1
- Pytorch 2.1.2
- Datasets 2.1.0
- Tokenizers 0.15.2