---
license: apache-2.0
base_model: GanjinZero/biobart-v2-base
tags:
- generated_from_trainer
metrics:
- rouge
model-index:
- name: fine-tuned-2048-inputs-30-epochs
  results: []
---

# fine-tuned-2048-inputs-30-epochs

This model is a fine-tuned version of [GanjinZero/biobart-v2-base](https://huggingface.co/GanjinZero/biobart-v2-base) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 0.8273
- Rouge1: 0.2933
- Rouge2: 0.1173
- Rougel: 0.2662
- Rougelsum: 0.2653
- Gen Len: 15.53 (mean length of the generated summaries, in tokens)
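
These scores compare generated summaries against reference summaries. As a rough illustration only (not the author's exact evaluation code), ROUGE values like those above can be computed with the `evaluate` library; the strings below are invented placeholders:

```python
import evaluate

# Load the ROUGE metric (requires: pip install evaluate rouge_score).
rouge = evaluate.load("rouge")

# Invented placeholder strings; in practice these are the model's decoded
# outputs and the gold summaries from the evaluation set.
predictions = ["brca1 mutations raise breast cancer risk"]
references = ["mutations in brca1 are associated with higher breast cancer risk"]

scores = rouge.compute(predictions=predictions, references=references)
print(scores)  # keys: rouge1, rouge2, rougeL, rougeLsum
```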

## Model description

More information needed

## Intended uses & limitations

More information needed
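
Going by the base model and the checkpoint name, the intended task appears to be biomedical abstractive summarization, with inputs of up to 2,048 tokens and short generated summaries (Gen Len above averages about 15 tokens). A minimal inference sketch, assuming the checkpoint is available on the Hub; the `model_id` below is a placeholder, not a confirmed path:

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

# Placeholder Hub path; substitute the actual location of this checkpoint.
model_id = "<namespace>/fine-tuned-2048-inputs-30-epochs"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

text = "Your biomedical passage to summarize goes here."
# The checkpoint name suggests the model accepts inputs up to 2048 tokens.
inputs = tokenizer(text, return_tensors="pt", truncation=True, max_length=2048)

# Summaries in the results table average ~15 tokens, so a small budget suffices.
summary_ids = model.generate(**inputs, max_new_tokens=32, num_beams=4)
print(tokenizer.decode(summary_ids[0], skip_special_tokens=True))
```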

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 30
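
As a sketch only, these settings map onto a `Seq2SeqTrainer` configuration roughly like the one below. The Adam betas and epsilon listed above are the `transformers` defaults, and the dataset objects are placeholders because the training data is not described in this card:

```python
from transformers import (
    AutoModelForSeq2SeqLM,
    AutoTokenizer,
    DataCollatorForSeq2Seq,
    Seq2SeqTrainer,
    Seq2SeqTrainingArguments,
)

tokenizer = AutoTokenizer.from_pretrained("GanjinZero/biobart-v2-base")
model = AutoModelForSeq2SeqLM.from_pretrained("GanjinZero/biobart-v2-base")

args = Seq2SeqTrainingArguments(
    output_dir="fine-tuned-2048-inputs-30-epochs",
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=30,
    evaluation_strategy="epoch",  # the results table reports one eval per epoch
    predict_with_generate=True,   # generate text at eval time so ROUGE can be scored
)

trainer = Seq2SeqTrainer(
    model=model,
    args=args,
    train_dataset=train_dataset,  # placeholder: dataset not described in this card
    eval_dataset=eval_dataset,    # placeholder: dataset not described in this card
    tokenizer=tokenizer,
    data_collator=DataCollatorForSeq2Seq(tokenizer, model=model),
)
trainer.train()
```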

### Training results

| Training Loss | Epoch | Step | Validation Loss | Rouge1 | Rouge2 | Rougel | Rougelsum | Gen Len |
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|:------:|:---------:|:-------:|
| No log        | 1.0   | 151  | 0.7529          | 0.2082 | 0.0781 | 0.1878 | 0.1884    | 13.16   |
| No log        | 2.0   | 302  | 0.7144          | 0.2589 | 0.0817 | 0.2239 | 0.2244    | 13.76   |
| No log        | 3.0   | 453  | 0.6993          | 0.2409 | 0.0773 | 0.2135 | 0.2136    | 14.52   |
| 0.7226        | 4.0   | 604  | 0.6957          | 0.2891 | 0.1014 | 0.2620 | 0.2618    | 14.27   |
| 0.7226        | 5.0   | 755  | 0.7037          | 0.2925 | 0.1167 | 0.2656 | 0.2670    | 14.73   |
| 0.7226        | 6.0   | 906  | 0.6971          | 0.2778 | 0.1124 | 0.2511 | 0.2501    | 14.92   |
| 0.4948        | 7.0   | 1057 | 0.7117          | 0.2816 | 0.1139 | 0.2558 | 0.2553    | 14.93   |
| 0.4948        | 8.0   | 1208 | 0.7185          | 0.2948 | 0.1192 | 0.2683 | 0.2679    | 14.45   |
| 0.4948        | 9.0   | 1359 | 0.7250          | 0.3039 | 0.1108 | 0.2748 | 0.2738    | 14.76   |
| 0.3680        | 10.0  | 1510 | 0.7343          | 0.3187 | 0.1267 | 0.2921 | 0.2919    | 14.67   |
| 0.3680        | 11.0  | 1661 | 0.7418          | 0.3067 | 0.1205 | 0.2780 | 0.2772    | 15.23   |
| 0.3680        | 12.0  | 1812 | 0.7521          | 0.3023 | 0.1134 | 0.2764 | 0.2756    | 14.91   |
| 0.3680        | 13.0  | 1963 | 0.7556          | 0.2945 | 0.1143 | 0.2720 | 0.2713    | 15.01   |
| 0.2865        | 14.0  | 2114 | 0.7636          | 0.3163 | 0.1246 | 0.2943 | 0.2942    | 15.44   |
| 0.2865        | 15.0  | 2265 | 0.7722          | 0.2987 | 0.1105 | 0.2705 | 0.2703    | 14.93   |
| 0.2865        | 16.0  | 2416 | 0.7788          | 0.3047 | 0.1091 | 0.2745 | 0.2744    | 15.29   |
| 0.2221        | 17.0  | 2567 | 0.7834          | 0.2973 | 0.1130 | 0.2698 | 0.2690    | 15.11   |
| 0.2221        | 18.0  | 2718 | 0.7905          | 0.2933 | 0.1139 | 0.2612 | 0.2595    | 15.10   |
| 0.2221        | 19.0  | 2869 | 0.7945          | 0.2936 | 0.1036 | 0.2637 | 0.2624    | 15.50   |
| 0.1825        | 20.0  | 3020 | 0.8033          | 0.3167 | 0.1216 | 0.2839 | 0.2837    | 15.54   |
| 0.1825        | 21.0  | 3171 | 0.8009          | 0.3056 | 0.1139 | 0.2753 | 0.2747    | 15.69   |
| 0.1825        | 22.0  | 3322 | 0.8085          | 0.2974 | 0.1130 | 0.2632 | 0.2621    | 15.37   |
| 0.1825        | 23.0  | 3473 | 0.8120          | 0.3063 | 0.1191 | 0.2746 | 0.2749    | 15.48   |
| 0.1498        | 24.0  | 3624 | 0.8163          | 0.3045 | 0.1114 | 0.2736 | 0.2724    | 15.47   |
| 0.1498        | 25.0  | 3775 | 0.8197          | 0.3091 | 0.1147 | 0.2789 | 0.2788    | 15.51   |
| 0.1498        | 26.0  | 3926 | 0.8212          | 0.3003 | 0.1211 | 0.2715 | 0.2718    | 15.59   |
| 0.1329        | 27.0  | 4077 | 0.8230          | 0.3046 | 0.1158 | 0.2751 | 0.2750    | 15.50   |
| 0.1329        | 28.0  | 4228 | 0.8250          | 0.2871 | 0.1118 | 0.2614 | 0.2599    | 15.49   |
| 0.1329        | 29.0  | 4379 | 0.8275          | 0.3030 | 0.1109 | 0.2734 | 0.2737    | 15.57   |
| 0.1226        | 30.0  | 4530 | 0.8273          | 0.2933 | 0.1173 | 0.2662 | 0.2653    | 15.53   |


### Framework versions

- Transformers 4.36.2
- Pytorch 1.12.1+cu113
- Datasets 2.15.0
- Tokenizers 0.15.0