---
base_model: silmi224/finetune-led-35000
tags:
- summarization
- generated_from_trainer
metrics:
- rouge
model-index:
- name: exp2-led-risalah_data_v5
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# exp2-led-risalah_data_v5

This model is a fine-tuned version of [silmi224/finetune-led-35000](https://huggingface.co/silmi224/finetune-led-35000) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 1.7589
- ROUGE-1: 26.7539
- ROUGE-2: 14.0212
- ROUGE-L: 20.1731
- ROUGE-Lsum: 25.1551

## Model description

More information needed

## Intended uses & limitations

More information needed
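
While the intended uses are not yet documented, the tags mark this as a summarization model, so a minimal inference sketch is given below. The hub id `silmi224/exp2-led-risalah_data_v5` is inferred from this card's model name and the base model's namespace, and the generation settings are illustrative defaults rather than values confirmed by this card.

```python
import torch
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

# Hub id inferred from this card's model name and the base model's
# namespace; adjust if the checkpoint is hosted elsewhere.
model_id = "silmi224/exp2-led-risalah_data_v5"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

long_document = "..."  # a long input document to summarize

inputs = tokenizer(long_document, return_tensors="pt", truncation=True)

# LED uses sparse local attention; the usual convention is to give the
# first token global attention so it can attend to the whole input.
global_attention_mask = torch.zeros_like(inputs["input_ids"])
global_attention_mask[:, 0] = 1

# max_length / num_beams are illustrative, not values from this card.
summary_ids = model.generate(
    inputs["input_ids"],
    attention_mask=inputs["attention_mask"],
    global_attention_mask=global_attention_mask,
    max_length=256,
    num_beams=4,
)
print(tokenizer.decode(summary_ids[0], skip_special_tokens=True))
```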

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a mapping onto `Seq2SeqTrainingArguments` is sketched after this list):
- learning_rate: 2e-05
- train_batch_size: 1
- eval_batch_size: 1
- seed: 42
- gradient_accumulation_steps: 8
- total_train_batch_size: 8
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 300
- num_epochs: 30
- mixed_precision_training: Native AMP
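
As a reading aid, here is how that list maps onto `Seq2SeqTrainingArguments`. This is a sketch, not the original training script: `output_dir`, the evaluation strategy, and `predict_with_generate` are assumptions, while the remaining values mirror the list above. The Adam betas and epsilon shown in the list are the library defaults, so they need no explicit arguments.

```python
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="exp2-led-risalah_data_v5",  # assumed
    learning_rate=2e-5,
    per_device_train_batch_size=1,
    per_device_eval_batch_size=1,
    seed=42,
    gradient_accumulation_steps=8,   # effective train batch size: 1 x 8 = 8
    lr_scheduler_type="linear",
    warmup_steps=300,
    num_train_epochs=30,
    fp16=True,                       # "Native AMP" mixed precision
    evaluation_strategy="epoch",     # assumed: the table reports per-epoch eval
    predict_with_generate=True,      # assumed: needed to compute ROUGE in eval
)
```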

### Training results

| Training Loss | Epoch | Step | Validation Loss | ROUGE-1 | ROUGE-2 | ROUGE-L | ROUGE-Lsum |
|:-------------:|:-----:|:----:|:---------------:|:-------:|:-------:|:-------:|:----------:|
| 2.9869        | 1.0   | 10   | 2.6276          | 11.0083 | 2.5296  | 7.506   | 10.1145   |
| 2.8353        | 2.0   | 20   | 2.4357          | 13.1976 | 3.65    | 8.9898  | 12.4118   |
| 2.5921        | 3.0   | 30   | 2.2455          | 16.1167 | 5.4776  | 10.4243 | 14.5496   |
| 2.3757        | 4.0   | 40   | 2.1281          | 17.8354 | 6.4512  | 11.7275 | 16.9533   |
| 2.1921        | 5.0   | 50   | 2.0448          | 18.5671 | 6.3838  | 11.8151 | 16.7216   |
| 2.0619        | 6.0   | 60   | 1.9658          | 19.418  | 7.8173  | 11.8446 | 17.7673   |
| 1.9489        | 7.0   | 70   | 1.9050          | 19.6714 | 8.7398  | 11.7735 | 18.2189   |
| 1.8528        | 8.0   | 80   | 1.8411          | 20.675  | 7.8977  | 13.2651 | 19.5985   |
| 1.7631        | 9.0   | 90   | 1.8053          | 21.4919 | 8.5355  | 14.4864 | 20.2444   |
| 1.6934        | 10.0  | 100  | 1.7678          | 22.8921 | 10.0231 | 15.5104 | 21.1864   |
| 1.6315        | 11.0  | 110  | 1.7469          | 23.6054 | 10.4158 | 17.2392 | 22.3891   |
| 1.5724        | 12.0  | 120  | 1.7193          | 24.3411 | 10.8448 | 17.7772 | 22.8939   |
| 1.5203        | 13.0  | 130  | 1.7036          | 24.21   | 12.0234 | 17.3522 | 23.2189   |
| 1.4649        | 14.0  | 140  | 1.6940          | 23.8491 | 12.0368 | 17.3502 | 23.0718   |
| 1.416         | 15.0  | 150  | 1.6830          | 26.1747 | 12.5858 | 17.6622 | 25.0178   |
| 1.3721        | 16.0  | 160  | 1.6824          | 24.8559 | 12.6545 | 17.8682 | 24.0      |
| 1.3305        | 17.0  | 170  | 1.6648          | 25.7287 | 12.9393 | 18.9578 | 24.6492   |
| 1.2875        | 18.0  | 180  | 1.6513          | 23.8505 | 12.4778 | 18.1576 | 23.5659   |
| 1.246         | 19.0  | 190  | 1.6479          | 24.4501 | 12.8525 | 18.2542 | 23.6368   |
| 1.2084        | 20.0  | 200  | 1.6578          | 25.1775 | 12.5258 | 19.1736 | 24.2117   |
| 1.1673        | 21.0  | 210  | 1.6560          | 24.077  | 11.612  | 18.7053 | 22.9301   |
| 1.1305        | 22.0  | 220  | 1.6623          | 25.3731 | 12.5498 | 19.0849 | 24.5792   |
| 1.0883        | 23.0  | 230  | 1.6841          | 25.5239 | 13.1539 | 18.9111 | 24.822    |
| 1.0525        | 24.0  | 240  | 1.6613          | 25.1741 | 12.7783 | 18.5908 | 23.8229   |
| 1.0194        | 25.0  | 250  | 1.6836          | 25.3784 | 12.4683 | 18.4044 | 23.728    |
| 0.9778        | 26.0  | 260  | 1.7123          | 26.1912 | 13.5667 | 19.8968 | 25.3049   |
| 0.9443        | 27.0  | 270  | 1.7145          | 26.1743 | 14.4669 | 19.4401 | 25.6498   |
| 0.912         | 28.0  | 280  | 1.7265          | 25.2527 | 12.0503 | 18.376  | 23.9647   |
| 0.8774        | 29.0  | 290  | 1.7314          | 25.7905 | 12.9724 | 19.3096 | 24.8918   |
| 0.8425        | 30.0  | 300  | 1.7589          | 26.7539 | 14.0212 | 20.1731 | 25.1551   |
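
The ROUGE columns above appear to be scores scaled to 0-100, as is typical for `generated_from_trainer` cards that compute the `rouge` metric with the `evaluate` library. A minimal sketch of that computation, with placeholder predictions and references:

```python
# requires: pip install evaluate rouge_score
import evaluate

# Placeholders; in the actual run these would be the generated and
# reference summaries from the evaluation set.
predictions = ["the board approved the budget for next year"]
references = ["the board approved next year's budget"]

rouge = evaluate.load("rouge")
scores = rouge.compute(predictions=predictions, references=references)

# `evaluate` returns fractions in [0, 1]; the table reports them
# multiplied by 100 (keys: rouge1, rouge2, rougeL, rougeLsum).
print({k: round(v * 100, 4) for k, v in scores.items()})
```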


### Framework versions

- Transformers 4.41.2
- Pytorch 2.1.2
- Datasets 2.19.2
- Tokenizers 0.19.1