---
tags:
- simplification
- generated_from_trainer
metrics:
- rouge
- sari
model-index:
- name: NASES-clara-med
  results: []
license: cc-by-nc-sa-4.0
datasets:
- lcampillos/CLARA-MeD
language:
- es
---


# NASES-clara-med

This model is a fine-tuned version of [ELiRF/NASES](https://huggingface.co/ELiRF/NASES) on the [CLARA-MeD](https://huggingface.co/datasets/lcampillos/CLARA-MeD) dataset.
It achieves the following results on the evaluation set (a sketch of how these metrics can be reproduced follows the list):
- Loss: 3.1754
- Rouge1: 45.2398
- Rouge2: 27.7502
- Rougel: 39.4698
- Rougelsum: 39.7208
- SARI: 49.5333
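
For reference, here is a minimal sketch of computing the same metric types with the `evaluate` library. The example sentences and the single-reference setup are illustrative assumptions, not taken from this card:

```python
import evaluate

# Load the same metric implementations reported above.
rouge = evaluate.load("rouge")
sari = evaluate.load("sari")

# Hypothetical example: source sentence, model output, reference simplification.
sources = ["La hipertensión arterial es una patología crónica."]        # "Arterial hypertension is a chronic pathology."
predictions = ["La tensión alta es una enfermedad que dura mucho tiempo."]
references = [["La tensión alta es una enfermedad crónica."]]            # one reference per source

print(rouge.compute(predictions=predictions, references=references))
# SARI additionally needs the original (complex) sentences.
print(sari.compute(sources=sources, predictions=predictions, references=references))
```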

## Model description

NASES-clara-med is a sequence-to-sequence model for automatic text simplification in Spanish. It fine-tunes [ELiRF/NASES](https://huggingface.co/ELiRF/NASES), a Spanish abstractive summarization model, to rewrite specialized medical text in plainer language using the paired texts of the CLARA-MeD corpus.
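
A minimal inference sketch with 🤗 Transformers. The repo id below is a placeholder for wherever this checkpoint is published, and the generation settings are illustrative defaults, not values from this card:

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

# Placeholder repo id: substitute the actual location of this checkpoint.
checkpoint = "your-username/NASES-clara-med"

tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForSeq2SeqLM.from_pretrained(checkpoint)

# "The patient presents exertional dyspnea and lower-limb edema."
text = "El paciente presenta disnea de esfuerzo y edemas en miembros inferiores."

inputs = tokenizer(text, return_tensors="pt", truncation=True)
# Beam search settings are illustrative, not taken from this card.
outputs = model.generate(**inputs, max_length=128, num_beams=4)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```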

## Intended uses & limitations

The model is intended for simplifying Spanish medical text into plain language for lay readers. It is released under a CC BY-NC-SA 4.0 license, which excludes commercial use. As with any generative model, outputs can omit or distort clinical information, so simplified texts should be reviewed by a domain expert before being shown to patients.

## Training and evaluation data

The model was fine-tuned and evaluated on [CLARA-MeD](https://huggingface.co/datasets/lcampillos/CLARA-MeD), a parallel corpus of Spanish medical texts paired with plain-language versions.
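
A sketch of loading the corpus with 🤗 Datasets. The split and column names are assumptions about the dataset layout, not documented in this card:

```python
from datasets import load_dataset

# Repo id taken from this card's metadata; splits and columns are assumed.
dataset = load_dataset("lcampillos/CLARA-MeD")
print(dataset)

# Hypothetical column names for the complex/simple text pairs.
example = dataset["train"][0]
print(example.get("source"), "->", example.get("target"))
```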

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a `Seq2SeqTrainingArguments` sketch follows the list):
- learning_rate: 5.6e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 30
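
A hedged sketch of how these values map onto `Seq2SeqTrainingArguments` in Transformers 4.25. Anything not listed above (output directory, evaluation strategy, `predict_with_generate`) is an assumption, not taken from this card:

```python
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="NASES-clara-med",    # assumed, not stated in this card
    learning_rate=5.6e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=30,
    evaluation_strategy="epoch",     # assumed; matches the per-epoch table below
    predict_with_generate=True,      # assumed; needed for ROUGE/SARI during eval
)
```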

### Training results

| Training Loss | Epoch | Step | Validation Loss | Rouge1  | Rouge2  | Rougel  | Rougelsum |
|:-------------:|:-----:|:----:|:---------------:|:-------:|:-------:|:-------:|:---------:|
| No log        | 1.0   | 190  | 2.1363          | 43.1891 | 26.0464 | 37.7101 | 37.8669   |
| No log        | 2.0   | 380  | 2.0887          | 44.3709 | 26.66   | 38.7491 | 38.9616   |
| 1.8749        | 3.0   | 570  | 2.0998          | 45.3838 | 27.6296 | 39.6766 | 39.8786   |
| 1.8749        | 4.0   | 760  | 2.2080          | 45.3734 | 27.9361 | 39.8229 | 39.9957   |
| 0.6851        | 5.0   | 950  | 2.3240          | 44.8206 | 27.4094 | 39.0302 | 39.249    |
| 0.6851        | 6.0   | 1140 | 2.4336          | 45.2087 | 27.6721 | 39.6997 | 39.9306   |
| 0.6851        | 7.0   | 1330 | 2.5224          | 45.3472 | 28.0703 | 39.9099 | 40.1756   |
| 0.2487        | 8.0   | 1520 | 2.5796          | 45.215  | 27.7175 | 39.5083 | 39.7442   |
| 0.2487        | 9.0   | 1710 | 2.6675          | 45.3478 | 27.5316 | 39.7082 | 39.9943   |
| 0.1383        | 10.0  | 1900 | 2.7055          | 44.6361 | 27.3284 | 38.8978 | 39.1641   |
| 0.1383        | 11.0  | 2090 | 2.7401          | 45.537  | 27.9101 | 39.8044 | 40.0529   |
| 0.1383        | 12.0  | 2280 | 2.7837          | 45.3551 | 27.7135 | 39.6413 | 39.8563   |
| 0.0866        | 13.0  | 2470 | 2.8190          | 45.9865 | 28.3685 | 40.3313 | 40.626    |
| 0.0866        | 14.0  | 2660 | 2.8380          | 45.3839 | 27.9721 | 39.8318 | 40.0786   |
| 0.065         | 15.0  | 2850 | 2.9169          | 45.3779 | 27.8374 | 39.7026 | 39.9432   |
| 0.065         | 16.0  | 3040 | 2.9225          | 45.3323 | 27.6681 | 39.5425 | 39.8021   |
| 0.065         | 17.0  | 3230 | 2.9558          | 45.507  | 28.2007 | 40.0316 | 40.3505   |
| 0.0465        | 18.0  | 3420 | 3.0746          | 45.5661 | 27.6864 | 39.7771 | 40.042    |
| 0.0465        | 19.0  | 3610 | 3.0260          | 45.4173 | 28.1651 | 39.9385 | 40.265    |
| 0.0287        | 20.0  | 3800 | 2.9955          | 44.8573 | 27.7183 | 39.3235 | 39.6152   |
| 0.0287        | 21.0  | 3990 | 3.0956          | 44.9341 | 27.481  | 39.4431 | 39.6973   |
| 0.0287        | 22.0  | 4180 | 3.1569          | 44.8046 | 27.4202 | 38.9288 | 39.2948   |
| 0.0205        | 23.0  | 4370 | 3.1127          | 45.6665 | 27.9091 | 39.9312 | 40.1756   |
| 0.0205        | 24.0  | 4560 | 3.1214          | 45.2634 | 27.757  | 39.6646 | 39.9734   |
| 0.0149        | 25.0  | 4750 | 3.1522          | 45.4023 | 27.961  | 39.6511 | 39.9969   |
| 0.0149        | 26.0  | 4940 | 3.1694          | 45.3276 | 27.7616 | 39.5195 | 39.776    |
| 0.0149        | 27.0  | 5130 | 3.1682          | 45.4472 | 27.8223 | 39.6778 | 39.9427   |
| 0.0126        | 28.0  | 5320 | 3.1421          | 45.4602 | 27.9026 | 39.8116 | 40.1192   |
| 0.0126        | 29.0  | 5510 | 3.1576          | 45.4435 | 27.9545 | 39.7496 | 39.9925   |
| 0.01          | 30.0  | 5700 | 3.1754          | 45.2398 | 27.7502 | 39.4698 | 39.7208   |


### Framework versions

- Transformers 4.25.1
- Pytorch 1.13.0
- Datasets 2.8.0
- Tokenizers 0.12.1