---
base_model: google/pegasus-large
tags:
- generated_from_trainer
metrics:
- rouge
- precision
- recall
- f1
model-index:
- name: LLM_Teached_Pegasus_100k_FS
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# LLM_Teached_Pegasus_100k_FS

This model is a fine-tuned version of [google/pegasus-large](https://huggingface.co/google/pegasus-large) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 1.4469
- Rouge1: 0.4939
- Rouge2: 0.2453
- Rougel: 0.4133
- Rougelsum: 0.4134
- Gen Len: 25.9629
- Precision: 0.9133
- Recall: 0.9138
- F1: 0.9134
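
The card does not yet document usage, so the snippet below is only a minimal inference sketch: it assumes the fine-tuned checkpoint is available locally or under a Hub id of the same name (not confirmed by this card), and the generation settings are illustrative rather than the ones used for evaluation.

```python
# Minimal summarization sketch; checkpoint id and generation settings are assumptions.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

checkpoint = "LLM_Teached_Pegasus_100k_FS"  # assumed local path or Hub id
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForSeq2SeqLM.from_pretrained(checkpoint)

article = "Text to summarize goes here."
inputs = tokenizer(article, truncation=True, max_length=1024, return_tensors="pt")
summary_ids = model.generate(**inputs, num_beams=4, max_new_tokens=64)
print(tokenizer.decode(summary_ids[0], skip_special_tokens=True))
```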

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a sketch of how they might map to `Seq2SeqTrainingArguments` follows the list):
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 6
- total_train_batch_size: 96
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 16
- mixed_precision_training: Native AMP
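
A minimal sketch of expressing these settings with `Seq2SeqTrainingArguments` is shown below. The dataset, model, and metric wiring are omitted because they are not documented in this card, and the `evaluation_strategy` / `predict_with_generate` values are assumptions inferred from the per-epoch results table.

```python
# Sketch only: maps the listed hyperparameters onto Seq2SeqTrainingArguments.
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="LLM_Teached_Pegasus_100k_FS",
    learning_rate=2e-05,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    gradient_accumulation_steps=6,   # 16 * 6 = 96 total train batch size (single device)
    num_train_epochs=16,
    lr_scheduler_type="linear",
    fp16=True,                       # "Native AMP" mixed precision
    evaluation_strategy="epoch",     # assumption: results table logs once per epoch
    predict_with_generate=True,      # assumption: needed for ROUGE / Gen Len during eval
)
```

The Adam settings in the list (betas=(0.9, 0.999), epsilon=1e-08) are the Trainer defaults, so they are not spelled out in the sketch.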

### Training results

| Training Loss | Epoch | Step  | Validation Loss | Rouge1 | Rouge2 | Rougel | Rougelsum | Gen Len | Precision | Recall | F1     |
|:-------------:|:-----:|:-----:|:---------------:|:------:|:------:|:------:|:---------:|:-------:|:---------:|:------:|:------:|
| 1.781         | 2.0   | 1388  | 1.5797          | 0.4708 | 0.2219 | 0.3892 | 0.389     | 26.8891 | 0.908     | 0.91   | 0.9088 |
| 1.6618        | 3.0   | 2083  | 1.5411          | 0.4776 | 0.2303 | 0.3977 | 0.3973    | 26.7282 | 0.9094    | 0.9111 | 0.91   |
| 1.626         | 4.0   | 2776  | 1.5171          | 0.4834 | 0.2345 | 0.402  | 0.402     | 26.7596 | 0.9102    | 0.9121 | 0.911  |
| 1.5918        | 5.0   | 3471  | 1.5001          | 0.4853 | 0.2365 | 0.4045 | 0.4045    | 26.6476 | 0.9106    | 0.9122 | 0.9112 |
| 1.5586        | 6.0   | 4164  | 1.4880          | 0.4875 | 0.2373 | 0.4063 | 0.4063    | 26.7778 | 0.9108    | 0.9127 | 0.9116 |
| 1.5375        | 7.0   | 4858  | 1.4768          | 0.4898 | 0.24   | 0.4083 | 0.4083    | 26.3991 | 0.9116    | 0.9128 | 0.912  |
| 1.5146        | 8.0   | 5553  | 1.4686          | 0.4907 | 0.241  | 0.4088 | 0.4089    | 26.156  | 0.9123    | 0.9133 | 0.9126 |
| 1.5006        | 9.0   | 6247  | 1.4636          | 0.4914 | 0.2419 | 0.4097 | 0.4099    | 26.2629 | 0.9122    | 0.9135 | 0.9127 |
| 1.49          | 10.0  | 6942  | 1.4580          | 0.4911 | 0.2429 | 0.4109 | 0.411     | 26.0273 | 0.9125    | 0.9133 | 0.9127 |
| 1.4749        | 11.0  | 7636  | 1.4546          | 0.4932 | 0.244  | 0.4121 | 0.4123    | 26.2304 | 0.9127    | 0.9138 | 0.9131 |
| 1.4661        | 12.0  | 8331  | 1.4514          | 0.4937 | 0.2448 | 0.4126 | 0.4127    | 25.8778 | 0.9133    | 0.9136 | 0.9132 |
| 1.4575        | 13.0  | 9025  | 1.4499          | 0.4947 | 0.2453 | 0.4139 | 0.414     | 26.1151 | 0.913     | 0.914  | 0.9133 |
| 1.4511        | 14.0  | 9720  | 1.4478          | 0.4939 | 0.2451 | 0.4133 | 0.4134    | 26.0287 | 0.9131    | 0.9138 | 0.9133 |
| 1.4519        | 15.0  | 10414 | 1.4471          | 0.4938 | 0.2451 | 0.4134 | 0.4134    | 25.9078 | 0.9132    | 0.9137 | 0.9133 |
| 1.4439        | 15.99 | 11104 | 1.4469          | 0.4939 | 0.2453 | 0.4133 | 0.4134    | 25.9629 | 0.9133    | 0.9138 | 0.9134 |
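
The card does not state which metric produced the Precision/Recall/F1 columns; the sketch below assumes BERTScore for those values (an assumption) and uses the `evaluate` library to show how ROUGE and BERTScore numbers of this kind can be computed.

```python
# Illustrative metric computation; BERTScore for Precision/Recall/F1 is an assumption.
import evaluate

rouge = evaluate.load("rouge")
bertscore = evaluate.load("bertscore")

predictions = ["a model-generated summary"]
references = ["the reference summary"]

rouge_scores = rouge.compute(predictions=predictions, references=references)
bert_scores = bertscore.compute(predictions=predictions, references=references, lang="en")

print(rouge_scores)  # rouge1, rouge2, rougeL, rougeLsum
print(sum(bert_scores["f1"]) / len(bert_scores["f1"]))  # mean F1 across examples
```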


### Framework versions

- Transformers 4.36.0
- Pytorch 2.0.1+cu117
- Datasets 2.14.5
- Tokenizers 0.15.0