---
license: apache-2.0
tags:
- generated_from_trainer
model-index:
- name: summarise_v11
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# summarise_v11

This model is a fine-tuned version of [allenai/led-base-16384](https://huggingface.co/allenai/led-base-16384) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 1.6322
- Rouge1 Precision: 0.6059
- Rouge1 Recall: 0.6233
- Rouge1 Fmeasure: 0.5895
- Rouge2 Precision: 0.4192
- Rouge2 Recall: 0.4512
- Rouge2 Fmeasure: 0.4176
- Rougel Precision: 0.4622
- Rougel Recall: 0.4946
- Rougel Fmeasure: 0.4566
- Rougelsum Precision: 0.4622
- Rougelsum Recall: 0.4946
- Rougelsum Fmeasure: 0.4566

## Model description

More information needed

## Intended uses & limitations

More information needed
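
Pending a fuller description, here is a minimal inference sketch. The repo id `summarise_v11` is a hypothetical placeholder (substitute the actual checkpoint path), and the generation settings are illustrative rather than the values used for the evaluation above.

```python
import torch
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_name = "summarise_v11"  # hypothetical repo id or local checkpoint path
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSeq2SeqLM.from_pretrained(model_name)

document = "..."  # long input text; LED accepts up to 16,384 tokens
inputs = tokenizer(document, return_tensors="pt", truncation=True, max_length=16384)

# LED uses sparse attention plus a global attention mask; putting global
# attention on the first token is the convention from the LED paper.
global_attention_mask = torch.zeros_like(inputs["input_ids"])
global_attention_mask[:, 0] = 1

summary_ids = model.generate(
    inputs["input_ids"],
    global_attention_mask=global_attention_mask,
    max_length=256,  # illustrative summary length
    num_beams=4,     # illustrative beam size
)
print(tokenizer.decode(summary_ids[0], skip_special_tokens=True))
```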

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10
- mixed_precision_training: Native AMP
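
For readers reproducing this setup, the list above maps onto `Seq2SeqTrainingArguments` roughly as sketched below. This is a hedged reconstruction, not the actual training script: `output_dir` is a placeholder, and the Adam betas/epsilon listed above are the optimizer defaults, so they need no explicit arguments.

```python
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="summarise_v11",     # hypothetical output path
    learning_rate=5e-05,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=4,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=10,
    fp16=True,                      # "Native AMP" mixed precision
    # Adam betas=(0.9, 0.999) and epsilon=1e-08 are the library defaults.
)
```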

### Training results

| Training Loss | Epoch | Step | Validation Loss | Rouge1 Precision | Rouge1 Recall | Rouge1 Fmeasure | Rouge2 Precision | Rouge2 Recall | Rouge2 Fmeasure | Rougel Precision | Rougel Recall | Rougel Fmeasure | Rougelsum Precision | Rougelsum Recall | Rougelsum Fmeasure |
|:-------------:|:-----:|:----:|:---------------:|:----------------:|:-------------:|:---------------:|:----------------:|:-------------:|:---------------:|:----------------:|:-------------:|:---------------:|:-------------------:|:----------------:|:------------------:|
| 1.6201        | 0.45  | 10   | 1.4875          | 0.3203           | 0.64          | 0.3932          | 0.197            | 0.3839        | 0.2385          | 0.1952           | 0.4051        | 0.2454          | 0.1952              | 0.4051           | 0.2454             |
| 0.9172        | 0.91  | 20   | 1.4404          | 0.4917           | 0.5134        | 0.4699          | 0.288            | 0.3095        | 0.276           | 0.3371           | 0.3594        | 0.3277          | 0.3371              | 0.3594           | 0.3277             |
| 1.0923        | 1.36  | 30   | 1.3575          | 0.519            | 0.5505        | 0.4936          | 0.3114           | 0.3237        | 0.2958          | 0.3569           | 0.3702        | 0.3364          | 0.3569              | 0.3702           | 0.3364             |
| 1.1287        | 1.82  | 40   | 1.3269          | 0.4913           | 0.5997        | 0.5068          | 0.3108           | 0.3964        | 0.3269          | 0.3355           | 0.427         | 0.3521          | 0.3355              | 0.427            | 0.3521             |
| 0.9938        | 2.27  | 50   | 1.3189          | 0.5339           | 0.5781        | 0.4973          | 0.3555           | 0.3883        | 0.3345          | 0.3914           | 0.4289        | 0.3678          | 0.3914              | 0.4289           | 0.3678             |
| 0.8659        | 2.73  | 60   | 1.3241          | 0.525            | 0.638         | 0.5165          | 0.3556           | 0.4349        | 0.3535          | 0.3914           | 0.4793        | 0.3886          | 0.3914              | 0.4793           | 0.3886             |
| 0.6187        | 3.18  | 70   | 1.3360          | 0.5875           | 0.5864        | 0.5416          | 0.4005           | 0.4045        | 0.3701          | 0.4485           | 0.4556        | 0.414           | 0.4485              | 0.4556           | 0.414              |
| 0.3941        | 3.64  | 80   | 1.4176          | 0.5373           | 0.6415        | 0.5328          | 0.3576           | 0.446         | 0.3642          | 0.3787           | 0.4586        | 0.3781          | 0.3787              | 0.4586           | 0.3781             |
| 0.4145        | 4.09  | 90   | 1.3936          | 0.4127           | 0.6553        | 0.4568          | 0.2568           | 0.4498        | 0.2988          | 0.2918           | 0.4933        | 0.328           | 0.2918              | 0.4933           | 0.328              |
| 0.4203        | 4.55  | 100  | 1.4703          | 0.6545           | 0.601         | 0.5981          | 0.4789           | 0.4373        | 0.438           | 0.5251           | 0.4851        | 0.4818          | 0.5251              | 0.4851           | 0.4818             |
| 0.687         | 5.0   | 110  | 1.4304          | 0.5566           | 0.6357        | 0.5637          | 0.3734           | 0.4186        | 0.3748          | 0.4251           | 0.4825        | 0.4286          | 0.4251              | 0.4825           | 0.4286             |
| 0.4006        | 5.45  | 120  | 1.5399          | 0.5994           | 0.5794        | 0.5515          | 0.4215           | 0.4218        | 0.398           | 0.4359           | 0.4369        | 0.4084          | 0.4359              | 0.4369           | 0.4084             |
| 0.2536        | 5.91  | 130  | 1.5098          | 0.5074           | 0.6254        | 0.4874          | 0.3369           | 0.4189        | 0.3256          | 0.3802           | 0.4738        | 0.3664          | 0.3802              | 0.4738           | 0.3664             |
| 0.2218        | 6.36  | 140  | 1.5278          | 0.5713           | 0.6059        | 0.5688          | 0.3887           | 0.4233        | 0.3916          | 0.4414           | 0.4795        | 0.4457          | 0.4414              | 0.4795           | 0.4457             |
| 0.2577        | 6.82  | 150  | 1.5469          | 0.5148           | 0.5941        | 0.5175          | 0.3284           | 0.3856        | 0.3335          | 0.3616           | 0.4268        | 0.3681          | 0.3616              | 0.4268           | 0.3681             |
| 0.1548        | 7.27  | 160  | 1.5986          | 0.5983           | 0.657         | 0.5862          | 0.4322           | 0.4877        | 0.4287          | 0.4466           | 0.5167        | 0.4482          | 0.4466              | 0.5167           | 0.4482             |
| 0.1535        | 7.73  | 170  | 1.5796          | 0.5609           | 0.641         | 0.5616          | 0.3856           | 0.4428        | 0.3892          | 0.4238           | 0.4921        | 0.4263          | 0.4238              | 0.4921           | 0.4263             |
| 0.1568        | 8.18  | 180  | 1.6052          | 0.5669           | 0.617         | 0.5679          | 0.3911           | 0.4382        | 0.3969          | 0.4363           | 0.4877        | 0.4417          | 0.4363              | 0.4877           | 0.4417             |
| 0.2038        | 8.64  | 190  | 1.6191          | 0.5466           | 0.5973        | 0.5313          | 0.3543           | 0.4114        | 0.3531          | 0.4061           | 0.4666        | 0.404           | 0.4061              | 0.4666           | 0.404              |
| 0.1808        | 9.09  | 200  | 1.6165          | 0.5751           | 0.5919        | 0.5587          | 0.3831           | 0.4097        | 0.3817          | 0.4482           | 0.4728        | 0.4405          | 0.4482              | 0.4728           | 0.4405             |
| 0.1021        | 9.55  | 210  | 1.6316          | 0.5316           | 0.6315        | 0.535           | 0.3588           | 0.4563        | 0.3697          | 0.405            | 0.502         | 0.4126          | 0.405               | 0.502            | 0.4126             |
| 0.1407        | 10.0  | 220  | 1.6322          | 0.6059           | 0.6233        | 0.5895          | 0.4192           | 0.4512        | 0.4176          | 0.4622           | 0.4946        | 0.4566          | 0.4622              | 0.4946           | 0.4566             |
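
The precision/recall/F-measure triples reported per metric above follow the layout returned by the `rouge` metric in the `datasets` library (pinned at 1.2.1 in this run). A hedged sketch, assuming the `rouge_score` backend package is installed; the example strings are illustrative only:

```python
from datasets import load_metric  # needs the rouge_score package installed

rouge = load_metric("rouge")
predictions = ["the cat sat on the mat"]       # illustrative model summary
references = ["a cat was sitting on the mat"]  # illustrative gold summary

result = rouge.compute(
    predictions=predictions,
    references=references,
    rouge_types=["rouge1", "rouge2", "rougeL", "rougeLsum"],
)

# Each entry is an AggregateScore; .mid holds the median precision,
# recall, and F-measure, matching the three columns per metric above.
mid = result["rouge2"].mid
print(round(mid.precision, 4), round(mid.recall, 4), round(mid.fmeasure, 4))
```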


### Framework versions

- Transformers 4.21.3
- Pytorch 1.12.1+cu113
- Datasets 1.2.1
- Tokenizers 0.12.1