---
license: mit
base_model: arthurmluz/ptt5-xlsumm-30epochs
tags:
- generated_from_trainer
metrics:
- rouge
model-index:
- name: ptt5-xlsumm-cstnews-1024
  results: []
---

# ptt5-xlsumm-cstnews-1024

This model is a fine-tuned version of [arthurmluz/ptt5-xlsumm-30epochs](https://huggingface.co/arthurmluz/ptt5-xlsumm-30epochs); the training metadata does not record the dataset, but the model name points to the CSTNews corpus.
It achieves the following results on the evaluation set:
- Loss: 1.1450
- Rouge1: 0.2635
- Rouge2: 0.2018
- Rougel: 0.2421
- Rougelsum: 0.2586
- Gen Len: 19.0

## Model description

PTT5 is a T5 model pretrained on Brazilian Portuguese text (the BrWaC corpus). The base checkpoint, [arthurmluz/ptt5-xlsumm-30epochs](https://huggingface.co/arthurmluz/ptt5-xlsumm-30epochs), was itself fine-tuned for 30 epochs on XL-Sum summarization data, and this model fine-tunes it further for Portuguese abstractive summarization. The `1024` in the name presumably refers to the maximum input length in tokens.

## Intended uses & limitations

The model is intended for abstractive summarization of Portuguese news text. Two limitations are worth noting: evaluation summaries averaged about 19 generated tokens (`Gen Len` below), which suggests generation ran with the default 20-token cap, so longer summaries require raising the generation length explicitly; and the fine-tuning set was very small (roughly 94 examples, given 47 optimizer steps per epoch at batch size 2), so the model may generalize poorly outside its source domain.
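
A minimal inference sketch with Transformers follows; the Hub id is taken from this card's name, while the 1024-token truncation, beam search, and summary-length settings are illustrative assumptions rather than values recorded in the card:

```python
# Minimal inference sketch. The Hub id is taken from this card's name; the
# 1024-token truncation and the generation settings are assumptions, not
# values recorded in the card.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_id = "arthurmluz/ptt5-xlsumm-cstnews-1024"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

text = "Texto em português a ser resumido..."  # input document

inputs = tokenizer(text, max_length=1024, truncation=True, return_tensors="pt")
summary_ids = model.generate(**inputs, max_new_tokens=64, num_beams=4)
print(tokenizer.decode(summary_ids[0], skip_special_tokens=True))
```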

## Training and evaluation data

Not recorded in the training metadata. The model name and the step counts below (47 optimizer steps per epoch at batch size 2, i.e. roughly 94 training examples) point to the CSTNews corpus, a small Portuguese multi-document summarization dataset.

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 2
- eval_batch_size: 2
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 30
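
For orientation, these values map onto `Seq2SeqTrainingArguments` roughly as sketched below (Transformers 4.34 API); `output_dir`, `evaluation_strategy`, and `predict_with_generate` are assumptions consistent with the per-epoch results table, not recorded settings:

```python
# Sketch of the configuration implied by the hyperparameters above
# (Transformers 4.34 API). output_dir, evaluation_strategy, and
# predict_with_generate are assumptions, not recorded settings.
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="ptt5-xlsumm-cstnews-1024",
    learning_rate=2e-5,
    per_device_train_batch_size=2,
    per_device_eval_batch_size=2,
    seed=42,
    num_train_epochs=30,
    lr_scheduler_type="linear",   # Adam betas/epsilon listed above are the defaults
    evaluation_strategy="epoch",  # one evaluation per epoch, matching the table
    predict_with_generate=True,   # required for the generation-based ROUGE metrics
)
```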

### Training results

| Training Loss | Epoch | Step | Validation Loss | Rouge1 | Rouge2 | Rougel | Rougelsum | Gen Len |
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|:------:|:---------:|:-------:|
| No log        | 1.0   | 47   | 1.3343          | 0.2102 | 0.1080 | 0.1696 | 0.1927    | 19.0    |
| No log        | 2.0   | 94   | 1.2369          | 0.2236 | 0.1347 | 0.1817 | 0.2103    | 18.9677 |
| No log        | 3.0   | 141  | 1.1886          | 0.2301 | 0.1508 | 0.1939 | 0.2167    | 18.871  |
| No log        | 4.0   | 188  | 1.1635          | 0.2506 | 0.1844 | 0.2273 | 0.2415    | 18.871  |
| 1.6578        | 5.0   | 235  | 1.1491          | 0.2521 | 0.1888 | 0.2309 | 0.2462    | 18.871  |
| 1.6578        | 6.0   | 282  | 1.1369          | 0.2594 | 0.1985 | 0.2372 | 0.2527    | 18.871  |
| 1.6578        | 7.0   | 329  | 1.1308          | 0.2611 | 0.1998 | 0.2399 | 0.2543    | 18.871  |
| 1.6578        | 8.0   | 376  | 1.1277          | 0.2579 | 0.1959 | 0.2361 | 0.2521    | 18.871  |
| 1.2590        | 9.0   | 423  | 1.1209          | 0.2610 | 0.1967 | 0.2382 | 0.2544    | 19.0    |
| 1.2590        | 10.0  | 470  | 1.1200          | 0.2625 | 0.1991 | 0.2403 | 0.2549    | 19.0    |
| 1.2590        | 11.0  | 517  | 1.1163          | 0.2617 | 0.1995 | 0.2403 | 0.2555    | 19.0    |
| 1.2590        | 12.0  | 564  | 1.1162          | 0.2662 | 0.2014 | 0.2435 | 0.2600    | 19.0    |
| 1.1096        | 13.0  | 611  | 1.1183          | 0.2676 | 0.2029 | 0.2466 | 0.2614    | 19.0    |
| 1.1096        | 14.0  | 658  | 1.1149          | 0.2677 | 0.2015 | 0.2454 | 0.2611    | 19.0    |
| 1.1096        | 15.0  | 705  | 1.1182          | 0.2677 | 0.2015 | 0.2454 | 0.2611    | 19.0    |
| 1.1096        | 16.0  | 752  | 1.1211          | 0.2663 | 0.2043 | 0.2467 | 0.2616    | 19.0    |
| 1.1096        | 17.0  | 799  | 1.1246          | 0.2654 | 0.2018 | 0.2445 | 0.2610    | 19.0    |
| 0.9916        | 18.0  | 846  | 1.1246          | 0.2665 | 0.2038 | 0.2455 | 0.2615    | 19.0    |
| 0.9916        | 19.0  | 893  | 1.1278          | 0.2661 | 0.2035 | 0.2457 | 0.2622    | 19.0    |
| 0.9916        | 20.0  | 940  | 1.1273          | 0.2650 | 0.2028 | 0.2439 | 0.2614    | 19.0    |
| 0.9916        | 21.0  | 987  | 1.1326          | 0.2661 | 0.2035 | 0.2457 | 0.2622    | 19.0    |
| 0.9003        | 22.0  | 1034 | 1.1372          | 0.2656 | 0.2027 | 0.2449 | 0.2615    | 19.0    |
| 0.9003        | 23.0  | 1081 | 1.1406          | 0.264  | 0.1994 | 0.2418 | 0.2591    | 19.0    |
| 0.9003        | 24.0  | 1128 | 1.1407          | 0.2644 | 0.2015 | 0.2419 | 0.2591    | 19.0    |
| 0.9003        | 25.0  | 1175 | 1.1430          | 0.263  | 0.1998 | 0.2415 | 0.2586    | 19.0    |
| 0.8442        | 26.0  | 1222 | 1.1426          | 0.2635 | 0.2018 | 0.2421 | 0.2586    | 19.0    |
| 0.8442        | 27.0  | 1269 | 1.1439          | 0.2635 | 0.2018 | 0.2421 | 0.2586    | 19.0    |
| 0.8442        | 28.0  | 1316 | 1.1451          | 0.2635 | 0.2018 | 0.2421 | 0.2586    | 19.0    |
| 0.8442        | 29.0  | 1363 | 1.1448          | 0.2635 | 0.2018 | 0.2421 | 0.2586    | 19.0    |
| 0.8285        | 30.0  | 1410 | 1.1450          | 0.2635 | 0.2018 | 0.2421 | 0.2586    | 19.0    |
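
The ROUGE values above are F-measures on a 0-1 scale. A sketch of the typical computation with the `evaluate` library (an assumed setup; the original `compute_metrics` function is not part of this card):

```python
# Typical ROUGE computation for a summarization compute_metrics function;
# an assumed setup, not taken from the original training script.
import evaluate

rouge = evaluate.load("rouge")
scores = rouge.compute(
    predictions=["resumo gerado pelo modelo"],    # decoded model outputs
    references=["resumo de referência escrito"],  # gold summaries
)
print(scores)  # {'rouge1': ..., 'rouge2': ..., 'rougeL': ..., 'rougeLsum': ...}
```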


### Framework versions

- Transformers 4.34.0
- Pytorch 2.0.1+cu117
- Datasets 2.14.5
- Tokenizers 0.14.1