---
tags:
- simplification
- generated_from_trainer
metrics:
- rouge
model-index:
- name: NASES-clara-med
  results: []
---

# NASES-clara-med

This model is a fine-tuned version of [ELiRF/NASES](https://huggingface.co/ELiRF/NASES) on an unspecified dataset (judging by the repository name and the `simplification` tag, most likely the CLARA-MeD medical text simplification corpus, though the card does not confirm this).
It achieves the following results on the evaluation set (these are the final, epoch-30 values):
- Loss: 3.2666
- ROUGE-1: 44.0787
- ROUGE-2: 26.1429
- ROUGE-L: 38.4286
- ROUGE-Lsum: 38.5202
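
The ROUGE values above are reported on a 0–100 scale. The card does not include the metric code, but a hedged sketch of how such scores are typically produced in a `Trainer`-based summarization run follows; it assumes the separate `evaluate` package (only `datasets` 2.8.0 is pinned below), so treat it as illustrative:

```python
# Hedged sketch: compute ROUGE the way Trainer-based summarization scripts
# usually do. The `evaluate` package (plus `rouge_score`) is an assumption;
# the card itself only pins the libraries listed under Framework versions.
import evaluate

rouge = evaluate.load("rouge")

def rouge_percentages(predictions, references):
    # evaluate's rouge returns fractions in [0, 1]; the card reports them
    # scaled to percentages (e.g. 0.440787 -> 44.0787).
    scores = rouge.compute(predictions=predictions, references=references)
    return {name: round(value * 100, 4) for name, value in scores.items()}

print(rouge_percentages(
    predictions=["el paciente tiene la tensión alta"],
    references=["el paciente presenta hipertensión"],
))
```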

## Model description

More information needed

## Intended uses & limitations

More information needed
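
Pending a fuller description, here is a minimal usage sketch. The checkpoint id is a placeholder (substitute the actual repository path), and it assumes the model loads with the standard seq2seq auto classes, as the base ELiRF/NASES checkpoint does:

```python
# Minimal usage sketch; the model id below is hypothetical, not the real path.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_id = "your-org/NASES-clara-med"  # placeholder repository id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

# A Spanish medical sentence to simplify (example input).
text = "El paciente presenta hipertensión arterial refractaria al tratamiento."

inputs = tokenizer(text, return_tensors="pt", truncation=True)
output_ids = model.generate(**inputs, num_beams=4, max_length=128)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```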

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a hedged code reconstruction follows the list):
- learning_rate: 5.6e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 30
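
The actual training script is not part of this card; the following is a hedged reconstruction of the corresponding `Seq2SeqTrainingArguments` for the Transformers 4.25 API, with the model and dataset wiring omitted and the `output_dir` and generation-related flags assumed:

```python
# Hedged reconstruction of the hyperparameters above. output_dir,
# evaluation_strategy, and predict_with_generate are assumptions inferred
# from the per-epoch results table, not stated in the card.
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="NASES-clara-med",
    learning_rate=5.6e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    num_train_epochs=30,
    lr_scheduler_type="linear",
    evaluation_strategy="epoch",   # the results table reports one eval per epoch
    predict_with_generate=True,    # required so ROUGE can be computed at eval time
)
# Adam with betas=(0.9, 0.999) and epsilon=1e-08 is the Trainer default,
# so the optimizer needs no extra arguments.
```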

### Training results

| Training Loss | Epoch | Step | Validation Loss | ROUGE-1 | ROUGE-2 | ROUGE-L | ROUGE-Lsum |
|:-------------:|:-----:|:----:|:---------------:|:-------:|:-------:|:-------:|:----------:|
| No log        | 1.0   | 190  | 2.1442          | 43.6265 | 25.4681 | 37.6224 | 37.8012   |
| No log        | 2.0   | 380  | 2.0839          | 44.0795 | 25.8075 | 37.9463 | 38.0445   |
| 1.8145        | 3.0   | 570  | 2.1689          | 43.3863 | 25.7517 | 37.4822 | 37.7461   |
| 1.8145        | 4.0   | 760  | 2.2569          | 43.9293 | 25.7951 | 37.9177 | 38.0658   |
| 0.6803        | 5.0   | 950  | 2.3760          | 43.9972 | 26.1618 | 38.4315 | 38.5305   |
| 0.6803        | 6.0   | 1140 | 2.4979          | 44.7986 | 27.0088 | 39.0031 | 39.1731   |
| 0.6803        | 7.0   | 1330 | 2.5881          | 43.8723 | 25.9782 | 38.1705 | 38.3225   |
| 0.2323        | 8.0   | 1520 | 2.6624          | 43.851  | 25.9263 | 38.2445 | 38.3659   |
| 0.2323        | 9.0   | 1710 | 2.7113          | 43.5292 | 25.4795 | 37.6883 | 37.8992   |
| 0.1464        | 10.0  | 1900 | 2.7451          | 44.6014 | 27.0125 | 38.9456 | 39.1796   |
| 0.1464        | 11.0  | 2090 | 2.7932          | 43.9568 | 26.0931 | 38.3672 | 38.5118   |
| 0.1464        | 12.0  | 2280 | 2.8651          | 43.8429 | 25.9007 | 38.0691 | 38.191    |
| 0.0863        | 13.0  | 2470 | 2.8978          | 44.192  | 26.1818 | 38.4167 | 38.579    |
| 0.0863        | 14.0  | 2660 | 2.9279          | 43.6745 | 25.6503 | 37.8948 | 38.0051   |
| 0.0657        | 15.0  | 2850 | 2.9942          | 44.1633 | 25.7856 | 38.0295 | 38.1905   |
| 0.0657        | 16.0  | 3040 | 2.9843          | 44.0347 | 25.9893 | 38.3486 | 38.5219   |
| 0.0657        | 17.0  | 3230 | 3.0189          | 44.3013 | 26.1884 | 38.5594 | 38.7396   |
| 0.0473        | 18.0  | 3420 | 3.0837          | 43.5877 | 25.6931 | 38.1147 | 38.2258   |
| 0.0473        | 19.0  | 3610 | 3.1025          | 44.1191 | 25.9657 | 38.338  | 38.5039   |
| 0.0302        | 20.0  | 3800 | 3.1395          | 44.393  | 26.3189 | 38.7891 | 38.8664   |
| 0.0302        | 21.0  | 3990 | 3.1808          | 44.4783 | 26.3023 | 38.4714 | 38.6428   |
| 0.0302        | 22.0  | 4180 | 3.1388          | 44.6364 | 26.7442 | 38.9591 | 39.1097   |
| 0.0194        | 23.0  | 4370 | 3.1859          | 44.919  | 26.9807 | 39.2653 | 39.3442   |
| 0.0194        | 24.0  | 4560 | 3.2126          | 44.4693 | 26.6534 | 38.8354 | 38.9278   |
| 0.0159        | 25.0  | 4750 | 3.1988          | 44.5436 | 26.63   | 38.9413 | 39.0007   |
| 0.0159        | 26.0  | 4940 | 3.2539          | 44.0378 | 26.0958 | 38.4445 | 38.5443   |
| 0.0159        | 27.0  | 5130 | 3.2844          | 44.6057 | 26.476  | 38.6502 | 38.7949   |
| 0.0117        | 28.0  | 5320 | 3.2755          | 44.1804 | 26.3747 | 38.6084 | 38.7027   |
| 0.0117        | 29.0  | 5510 | 3.2731          | 44.0453 | 26.0298 | 38.3911 | 38.4826   |
| 0.0102        | 30.0  | 5700 | 3.2666          | 44.0787 | 26.1429 | 38.4286 | 38.5202   |
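
Validation loss bottoms out at epoch 2 (2.0839) and climbs steadily afterwards while training loss falls below 0.02, so the final checkpoint likely overfits; the metrics at the top of the card correspond to epoch 30, not the best epoch. A hedged sketch of keeping the best checkpoint instead (purely illustrative; this run did not use it):

```python
# Illustrative only: retain the checkpoint with the lowest validation loss
# and stop early once it stops improving. Not used for the run above.
from transformers import EarlyStoppingCallback, Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="NASES-clara-med",
    evaluation_strategy="epoch",
    save_strategy="epoch",             # must match evaluation_strategy
    load_best_model_at_end=True,
    metric_for_best_model="eval_loss",
    greater_is_better=False,
)
early_stopping = EarlyStoppingCallback(early_stopping_patience=3)
# Pass `callbacks=[early_stopping]` when constructing the Seq2SeqTrainer.
```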


### Framework versions

- Transformers 4.25.1
- PyTorch 1.13.0
- Datasets 2.8.0
- Tokenizers 0.12.1