---
tags:
- simplification
- generated_from_trainer
metrics:
- rouge
model-index:
- name: NASES-clara-med
  results: []
---

# NASES-clara-med

This model is a fine-tuned version of [ELiRF/NASES](https://huggingface.co/ELiRF/NASES); the training dataset is not specified in this card.
It achieves the following results on the evaluation set (a usage sketch follows the list):
- Loss: 3.2763
- Rouge1: 42.9986
- Rouge2: 25.2365
- RougeL: 37.0782
- RougeLsum: 37.278
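
The checkpoint can be loaded like any `transformers` sequence-to-sequence model. A minimal generation sketch, assuming the fine-tuned weights are stored locally or on the Hub under a placeholder id and that the checkpoint exposes a standard seq2seq head:

```python
# Minimal inference sketch. The model id is a placeholder for wherever this
# fine-tuned checkpoint is stored; adjust it to your local path or Hub repo.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_id = "path/to/NASES-clara-med"  # placeholder, not a published repo id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

text = "Texto médico en español que se desea simplificar."  # example input
inputs = tokenizer(text, return_tensors="pt", truncation=True)
output_ids = model.generate(**inputs, num_beams=4, max_length=128)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```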

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a `Seq2SeqTrainingArguments` sketch follows the list):
- learning_rate: 5.6e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 30
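
These values map directly onto `transformers`' `Seq2SeqTrainingArguments`. A sketch under two assumptions not stated in the card: evaluation ran once per epoch (consistent with the results table below) and generation was enabled for ROUGE scoring:

```python
# Sketch of the reported hyperparameters as Seq2SeqTrainingArguments.
# output_dir is a placeholder; evaluation_strategy and predict_with_generate
# are assumptions inferred from the per-epoch ROUGE table, not from the card.
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="NASES-clara-med",  # placeholder output path
    learning_rate=5.6e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=30,
    # Adam betas=(0.9, 0.999) and epsilon=1e-08 are the Trainer defaults,
    # so no explicit optimizer arguments are needed here.
    evaluation_strategy="epoch",   # assumption
    predict_with_generate=True,    # assumption
)
```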

### Training results

| Training Loss | Epoch | Step | Validation Loss | Rouge1  | Rouge2  | RougeL  | RougeLsum |
|:-------------:|:-----:|:----:|:---------------:|:-------:|:-------:|:-------:|:---------:|
| No log        | 1.0   | 190  | 2.1704          | 42.0923 | 24.011  | 36.2317 | 36.3819   |
| No log        | 2.0   | 380  | 2.1260          | 42.3364 | 24.6464 | 36.8836 | 37.0023   |
| 1.9093        | 3.0   | 570  | 2.1582          | 43.7464 | 26.0481 | 38.1321 | 38.2575   |
| 1.9093        | 4.0   | 760  | 2.2436          | 43.1348 | 25.6313 | 37.5276 | 37.688    |
| 0.7294        | 5.0   | 950  | 2.3852          | 43.9276 | 26.2853 | 38.1775 | 38.3529   |
| 0.7294        | 6.0   | 1140 | 2.5096          | 42.6241 | 25.1825 | 36.9084 | 37.1236   |
| 0.7294        | 7.0   | 1330 | 2.5986          | 43.4603 | 25.7703 | 37.762  | 38.0026   |
| 0.2438        | 8.0   | 1520 | 2.6878          | 42.483  | 24.6796 | 36.7012 | 36.9424   |
| 0.2438        | 9.0   | 1710 | 2.7096          | 43.3953 | 25.6418 | 37.4906 | 37.8048   |
| 0.1422        | 10.0  | 1900 | 2.7879          | 43.1926 | 25.3773 | 37.2548 | 37.4858   |
| 0.1422        | 11.0  | 2090 | 2.8629          | 43.7788 | 25.7912 | 37.6712 | 37.8664   |
| 0.1422        | 12.0  | 2280 | 2.9139          | 43.5132 | 25.6003 | 37.5426 | 37.7154   |
| 0.0911        | 13.0  | 2470 | 2.9267          | 43.2335 | 25.5807 | 37.4857 | 37.6547   |
| 0.0911        | 14.0  | 2660 | 2.9826          | 42.4726 | 24.6801 | 36.8142 | 36.9149   |
| 0.0704        | 15.0  | 2850 | 2.9834          | 42.7464 | 25.0051 | 37.0043 | 37.188    |
| 0.0704        | 16.0  | 3040 | 3.0423          | 42.7331 | 25.1076 | 36.8757 | 37.1165   |
| 0.0704        | 17.0  | 3230 | 3.0602          | 43.5046 | 25.9845 | 37.9281 | 38.0868   |
| 0.0529        | 18.0  | 3420 | 3.0882          | 42.7186 | 25.0104 | 36.943  | 37.1559   |
| 0.0529        | 19.0  | 3610 | 3.0713          | 43.0051 | 25.3356 | 37.0809 | 37.2836   |
| 0.0383        | 20.0  | 3800 | 3.1547          | 43.2239 | 25.3545 | 37.2722 | 37.4304   |
| 0.0383        | 21.0  | 3990 | 3.1408          | 43.2171 | 25.266  | 37.1733 | 37.4219   |
| 0.0383        | 22.0  | 4180 | 3.1739          | 43.1094 | 25.2674 | 37.3491 | 37.5596   |
| 0.0252        | 23.0  | 4370 | 3.2036          | 43.0451 | 25.3833 | 37.2896 | 37.469    |
| 0.0252        | 24.0  | 4560 | 3.2291          | 43.2983 | 25.5308 | 37.6024 | 37.7772   |
| 0.0173        | 25.0  | 4750 | 3.2607          | 43.0005 | 25.0403 | 37.2126 | 37.367    |
| 0.0173        | 26.0  | 4940 | 3.2498          | 42.869  | 24.9531 | 37.0616 | 37.2307   |
| 0.0173        | 27.0  | 5130 | 3.3016          | 43.1913 | 25.1199 | 37.2238 | 37.4256   |
| 0.0135        | 28.0  | 5320 | 3.2813          | 43.1867 | 25.2193 | 37.2014 | 37.4029   |
| 0.0135        | 29.0  | 5510 | 3.2757          | 42.9765 | 25.2217 | 37.0312 | 37.2317   |
| 0.0113        | 30.0  | 5700 | 3.2763          | 42.9986 | 25.2365 | 37.0782 | 37.278    |
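
The ROUGE columns are presumably f-measures scaled by 100, as in the standard `transformers` summarization fine-tuning recipe. A minimal sketch of that computation with the `evaluate` library (the card does not include the actual evaluation script; the inputs below are placeholders):

```python
# ROUGE computation sketch with the `evaluate` library. This mirrors the usual
# summarization fine-tuning recipe (f-measure * 100); the exact eval script is
# not included in this card, so treat the details as assumptions.
import evaluate

rouge = evaluate.load("rouge")
predictions = ["frase simplificada generada por el modelo"]  # placeholder output
references = ["frase simplificada de referencia"]            # placeholder gold text

scores = rouge.compute(predictions=predictions, references=references)
print({k: round(v * 100, 4) for k, v in scores.items()})
# keys: rouge1, rouge2, rougeL, rougeLsum, matching the columns above
```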


### Framework versions

- Transformers 4.25.1
- PyTorch 1.13.0
- Datasets 2.8.0
- Tokenizers 0.12.1
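
To reproduce this environment, the versions above can be pinned directly, e.g. in a requirements file (a sketch; note that the PyTorch pip package is named `torch`):

```text
transformers==4.25.1
torch==1.13.0
datasets==2.8.0
tokenizers==0.12.1
```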