limaatulya committed
Commit b72ea28
1 Parent(s): 11cf8b9

End of training

README.md ADDED
@@ -0,0 +1,164 @@
+ ---
+ license: apache-2.0
+ base_model: google-t5/t5-small
+ tags:
+ - generated_from_trainer
+ metrics:
+ - rouge
+ model-index:
+ - name: my_awesome_billsum_model_64
+   results: []
+ ---
+
+ <!-- This model card has been generated automatically according to the information the Trainer had access to. You
+ should probably proofread and complete it, then remove this comment. -->
+
+ # my_awesome_billsum_model_64
+
+ This model is a fine-tuned version of [google-t5/t5-small](https://huggingface.co/google-t5/t5-small) on an unspecified dataset.
+ It achieves the following results on the evaluation set:
+ - Loss: 0.9763
+ - Rouge1: 0.9612
+ - Rouge2: 0.844
+ - RougeL: 0.9033
+ - RougeLsum: 0.9017
+ - Gen Len: 5.0833
+
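+ As a quick-start sketch (untested; the repo id below is an assumption inferred from the committer and model name, and the `summarize:` prefix is the standard T5 convention):
+
+ ```python
+ from transformers import AutoTokenizer, AutoModelForSeq2SeqLM
+
+ # Assumed Hub path for this checkpoint -- adjust if the repo lives elsewhere.
+ model_id = "limaatulya/my_awesome_billsum_model_64"
+ tokenizer = AutoTokenizer.from_pretrained(model_id)
+ model = AutoModelForSeq2SeqLM.from_pretrained(model_id)
+
+ # T5 checkpoints expect a task prefix; the short Gen Len reported above
+ # suggests this model was tuned to emit very short summaries.
+ text = "summarize: " + "Replace this with the text to summarize."
+ inputs = tokenizer(text, return_tensors="pt", truncation=True)
+ summary_ids = model.generate(**inputs, max_new_tokens=32)
+ print(tokenizer.decode(summary_ids[0], skip_special_tokens=True))
+ ```
+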
+ ## Model description
+
+ More information needed
+
+ ## Intended uses & limitations
+
+ More information needed
+
+ ## Training and evaluation data
+
+ More information needed
+
+ ## Training procedure
+
+ ### Training hyperparameters
+
+ The following hyperparameters were used during training (a `Seq2SeqTrainingArguments` sketch follows the list):
+ - learning_rate: 2e-05
+ - train_batch_size: 16
+ - eval_batch_size: 16
+ - seed: 42
+ - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
+ - lr_scheduler_type: linear
+ - num_epochs: 100
+ - mixed_precision_training: Native AMP
+
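+ A minimal sketch of how these values map onto `Seq2SeqTrainingArguments` (the actual training script is not part of this commit; the Adam betas and epsilon listed above match the library defaults):
+
+ ```python
+ from transformers import Seq2SeqTrainingArguments
+
+ # Sketch only: reconstructs the listed hyperparameters with standard
+ # transformers argument names; output_dir is an assumption.
+ args = Seq2SeqTrainingArguments(
+     output_dir="my_awesome_billsum_model_64",
+     learning_rate=2e-5,
+     per_device_train_batch_size=16,
+     per_device_eval_batch_size=16,
+     seed=42,
+     lr_scheduler_type="linear",
+     num_train_epochs=100,
+     fp16=True,  # "Native AMP" mixed precision
+     predict_with_generate=True,  # needed for ROUGE evaluation
+ )
+ ```
+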
+ ### Training results
+
+ | Training Loss | Epoch | Step | Validation Loss | Rouge1 | Rouge2 | RougeL | RougeLsum | Gen Len |
+ |:-------------:|:-----:|:----:|:---------------:|:------:|:------:|:------:|:---------:|:-------:|
+ | No log | 1.0 | 12 | 0.8485 | 0.9571 | 0.8119 | 0.8882 | 0.8859 | 5.0208 |
+ | No log | 2.0 | 24 | 0.8935 | 0.9571 | 0.8119 | 0.8882 | 0.8859 | 5.0208 |
+ | No log | 3.0 | 36 | 0.8809 | 0.9604 | 0.8177 | 0.887 | 0.884 | 5.0417 |
+ | No log | 4.0 | 48 | 0.8664 | 0.9604 | 0.8177 | 0.887 | 0.884 | 5.0417 |
+ | No log | 5.0 | 60 | 0.8449 | 0.9571 | 0.8259 | 0.8928 | 0.8902 | 5.0208 |
+ | No log | 6.0 | 72 | 0.8350 | 0.9604 | 0.8324 | 0.8912 | 0.8885 | 5.0417 |
+ | No log | 7.0 | 84 | 0.8348 | 0.9604 | 0.8324 | 0.8912 | 0.8885 | 5.0417 |
+ | No log | 8.0 | 96 | 0.8322 | 0.9604 | 0.8324 | 0.8912 | 0.8885 | 5.0417 |
+ | No log | 9.0 | 108 | 0.8269 | 0.9604 | 0.8324 | 0.8912 | 0.8885 | 5.0417 |
+ | No log | 10.0 | 120 | 0.8218 | 0.958 | 0.8311 | 0.8953 | 0.8925 | 5.0625 |
+ | No log | 11.0 | 132 | 0.8252 | 0.9604 | 0.8324 | 0.8912 | 0.8885 | 5.0417 |
+ | No log | 12.0 | 144 | 0.8302 | 0.9604 | 0.8324 | 0.8912 | 0.8885 | 5.0417 |
+ | No log | 13.0 | 156 | 0.8310 | 0.9604 | 0.8324 | 0.8912 | 0.8885 | 5.0417 |
+ | No log | 14.0 | 168 | 0.8299 | 0.9633 | 0.852 | 0.9008 | 0.8974 | 5.0208 |
+ | No log | 15.0 | 180 | 0.8360 | 0.9604 | 0.8324 | 0.8912 | 0.8885 | 5.0417 |
+ | No log | 16.0 | 192 | 0.8435 | 0.9633 | 0.8453 | 0.8997 | 0.8974 | 5.0625 |
+ | No log | 17.0 | 204 | 0.8570 | 0.9603 | 0.8397 | 0.901 | 0.8987 | 5.0417 |
+ | No log | 18.0 | 216 | 0.8725 | 0.9571 | 0.8259 | 0.8928 | 0.8902 | 5.0208 |
+ | No log | 19.0 | 228 | 0.8580 | 0.9633 | 0.8453 | 0.8997 | 0.8974 | 5.0625 |
+ | No log | 20.0 | 240 | 0.8545 | 0.9612 | 0.844 | 0.9033 | 0.9017 | 5.0833 |
+ | No log | 21.0 | 252 | 0.8630 | 0.9612 | 0.844 | 0.9033 | 0.9017 | 5.0833 |
+ | No log | 22.0 | 264 | 0.8652 | 0.9612 | 0.844 | 0.9033 | 0.9017 | 5.0833 |
+ | No log | 23.0 | 276 | 0.8782 | 0.9633 | 0.8453 | 0.8997 | 0.8974 | 5.0625 |
+ | No log | 24.0 | 288 | 0.8781 | 0.9633 | 0.8453 | 0.8997 | 0.8974 | 5.0625 |
+ | No log | 25.0 | 300 | 0.8863 | 0.9604 | 0.8324 | 0.8912 | 0.8885 | 5.0417 |
+ | No log | 26.0 | 312 | 0.8921 | 0.9633 | 0.8453 | 0.8997 | 0.8974 | 5.0625 |
+ | No log | 27.0 | 324 | 0.8998 | 0.9633 | 0.8453 | 0.8997 | 0.8974 | 5.0625 |
+ | No log | 28.0 | 336 | 0.8914 | 0.9612 | 0.844 | 0.9033 | 0.9017 | 5.0833 |
+ | No log | 29.0 | 348 | 0.8952 | 0.9612 | 0.844 | 0.9033 | 0.9017 | 5.0833 |
+ | No log | 30.0 | 360 | 0.9034 | 0.9612 | 0.844 | 0.9033 | 0.9017 | 5.0833 |
+ | No log | 31.0 | 372 | 0.9191 | 0.9633 | 0.8453 | 0.8997 | 0.8974 | 5.0625 |
+ | No log | 32.0 | 384 | 0.9315 | 0.9633 | 0.8453 | 0.8997 | 0.8974 | 5.0625 |
+ | No log | 33.0 | 396 | 0.9278 | 0.9633 | 0.8453 | 0.8997 | 0.8974 | 5.0625 |
+ | No log | 34.0 | 408 | 0.9266 | 0.9603 | 0.8397 | 0.901 | 0.8987 | 5.0417 |
+ | No log | 35.0 | 420 | 0.9362 | 0.9603 | 0.8397 | 0.901 | 0.8987 | 5.0417 |
+ | No log | 36.0 | 432 | 0.9378 | 0.9603 | 0.8397 | 0.901 | 0.8987 | 5.0417 |
+ | No log | 37.0 | 444 | 0.9359 | 0.9603 | 0.8397 | 0.901 | 0.8987 | 5.0417 |
+ | No log | 38.0 | 456 | 0.9397 | 0.9625 | 0.8409 | 0.8967 | 0.8942 | 5.0208 |
+ | No log | 39.0 | 468 | 0.9427 | 0.9625 | 0.8409 | 0.8967 | 0.8942 | 5.0208 |
+ | No log | 40.0 | 480 | 0.9438 | 0.9625 | 0.8409 | 0.8967 | 0.8942 | 5.0208 |
+ | No log | 41.0 | 492 | 0.9530 | 0.9625 | 0.8409 | 0.8967 | 0.8942 | 5.0208 |
+ | 0.0391 | 42.0 | 504 | 0.9583 | 0.9625 | 0.8409 | 0.8967 | 0.8942 | 5.0208 |
+ | 0.0391 | 43.0 | 516 | 0.9597 | 0.9625 | 0.8409 | 0.8967 | 0.8942 | 5.0208 |
+ | 0.0391 | 44.0 | 528 | 0.9534 | 0.9603 | 0.8397 | 0.901 | 0.8987 | 5.0417 |
+ | 0.0391 | 45.0 | 540 | 0.9508 | 0.9612 | 0.844 | 0.9033 | 0.9017 | 5.0833 |
+ | 0.0391 | 46.0 | 552 | 0.9519 | 0.9612 | 0.844 | 0.9033 | 0.9017 | 5.0833 |
+ | 0.0391 | 47.0 | 564 | 0.9433 | 0.9612 | 0.844 | 0.9033 | 0.9017 | 5.0833 |
+ | 0.0391 | 48.0 | 576 | 0.9401 | 0.9612 | 0.844 | 0.9033 | 0.9017 | 5.0833 |
+ | 0.0391 | 49.0 | 588 | 0.9506 | 0.9612 | 0.844 | 0.9033 | 0.9017 | 5.0833 |
+ | 0.0391 | 50.0 | 600 | 0.9630 | 0.9612 | 0.844 | 0.9033 | 0.9017 | 5.0833 |
+ | 0.0391 | 51.0 | 612 | 0.9651 | 0.9612 | 0.844 | 0.9033 | 0.9017 | 5.0833 |
+ | 0.0391 | 52.0 | 624 | 0.9641 | 0.9612 | 0.844 | 0.9033 | 0.9017 | 5.0833 |
+ | 0.0391 | 53.0 | 636 | 0.9592 | 0.9612 | 0.844 | 0.9033 | 0.9017 | 5.0833 |
+ | 0.0391 | 54.0 | 648 | 0.9584 | 0.9612 | 0.844 | 0.9033 | 0.9017 | 5.0833 |
+ | 0.0391 | 55.0 | 660 | 0.9574 | 0.9612 | 0.844 | 0.9033 | 0.9017 | 5.0833 |
+ | 0.0391 | 56.0 | 672 | 0.9594 | 0.9612 | 0.844 | 0.9033 | 0.9017 | 5.0833 |
+ | 0.0391 | 57.0 | 684 | 0.9616 | 0.9612 | 0.844 | 0.9033 | 0.9017 | 5.0833 |
+ | 0.0391 | 58.0 | 696 | 0.9607 | 0.9612 | 0.844 | 0.9033 | 0.9017 | 5.0833 |
+ | 0.0391 | 59.0 | 708 | 0.9563 | 0.9612 | 0.844 | 0.9033 | 0.9017 | 5.0833 |
+ | 0.0391 | 60.0 | 720 | 0.9615 | 0.9612 | 0.844 | 0.9033 | 0.9017 | 5.0833 |
+ | 0.0391 | 61.0 | 732 | 0.9628 | 0.9612 | 0.844 | 0.9033 | 0.9017 | 5.0833 |
+ | 0.0391 | 62.0 | 744 | 0.9678 | 0.9612 | 0.844 | 0.9033 | 0.9017 | 5.0833 |
+ | 0.0391 | 63.0 | 756 | 0.9699 | 0.9612 | 0.844 | 0.9033 | 0.9017 | 5.0833 |
+ | 0.0391 | 64.0 | 768 | 0.9694 | 0.9612 | 0.844 | 0.9033 | 0.9017 | 5.0833 |
+ | 0.0391 | 65.0 | 780 | 0.9663 | 0.9612 | 0.844 | 0.9033 | 0.9017 | 5.0833 |
+ | 0.0391 | 66.0 | 792 | 0.9755 | 0.9612 | 0.844 | 0.9033 | 0.9017 | 5.0833 |
+ | 0.0391 | 67.0 | 804 | 0.9824 | 0.9612 | 0.844 | 0.9033 | 0.9017 | 5.0833 |
+ | 0.0391 | 68.0 | 816 | 0.9811 | 0.9612 | 0.844 | 0.9033 | 0.9017 | 5.0833 |
+ | 0.0391 | 69.0 | 828 | 0.9752 | 0.9612 | 0.844 | 0.9033 | 0.9017 | 5.0833 |
+ | 0.0391 | 70.0 | 840 | 0.9725 | 0.9612 | 0.844 | 0.9033 | 0.9017 | 5.0833 |
+ | 0.0391 | 71.0 | 852 | 0.9733 | 0.9612 | 0.844 | 0.9033 | 0.9017 | 5.0833 |
+ | 0.0391 | 72.0 | 864 | 0.9741 | 0.9612 | 0.844 | 0.9033 | 0.9017 | 5.0833 |
+ | 0.0391 | 73.0 | 876 | 0.9743 | 0.9612 | 0.844 | 0.9033 | 0.9017 | 5.0833 |
+ | 0.0391 | 74.0 | 888 | 0.9746 | 0.9612 | 0.844 | 0.9033 | 0.9017 | 5.0833 |
+ | 0.0391 | 75.0 | 900 | 0.9726 | 0.9612 | 0.844 | 0.9033 | 0.9017 | 5.0833 |
+ | 0.0391 | 76.0 | 912 | 0.9732 | 0.9612 | 0.844 | 0.9033 | 0.9017 | 5.0833 |
+ | 0.0391 | 77.0 | 924 | 0.9741 | 0.9612 | 0.844 | 0.9033 | 0.9017 | 5.0833 |
+ | 0.0391 | 78.0 | 936 | 0.9759 | 0.9612 | 0.844 | 0.9033 | 0.9017 | 5.0833 |
+ | 0.0391 | 79.0 | 948 | 0.9796 | 0.9612 | 0.844 | 0.9033 | 0.9017 | 5.0833 |
+ | 0.0391 | 80.0 | 960 | 0.9808 | 0.9612 | 0.844 | 0.9033 | 0.9017 | 5.0833 |
+ | 0.0391 | 81.0 | 972 | 0.9815 | 0.9612 | 0.844 | 0.9033 | 0.9017 | 5.0833 |
+ | 0.0391 | 82.0 | 984 | 0.9797 | 0.9612 | 0.844 | 0.9033 | 0.9017 | 5.0833 |
+ | 0.0391 | 83.0 | 996 | 0.9789 | 0.9612 | 0.844 | 0.9033 | 0.9017 | 5.0833 |
+ | 0.0214 | 84.0 | 1008 | 0.9786 | 0.9612 | 0.844 | 0.9033 | 0.9017 | 5.0833 |
+ | 0.0214 | 85.0 | 1020 | 0.9810 | 0.9612 | 0.844 | 0.9033 | 0.9017 | 5.0833 |
+ | 0.0214 | 86.0 | 1032 | 0.9822 | 0.9612 | 0.844 | 0.9033 | 0.9017 | 5.0833 |
+ | 0.0214 | 87.0 | 1044 | 0.9831 | 0.9612 | 0.844 | 0.9033 | 0.9017 | 5.0833 |
+ | 0.0214 | 88.0 | 1056 | 0.9818 | 0.9612 | 0.844 | 0.9033 | 0.9017 | 5.0833 |
+ | 0.0214 | 89.0 | 1068 | 0.9814 | 0.9612 | 0.844 | 0.9033 | 0.9017 | 5.0833 |
+ | 0.0214 | 90.0 | 1080 | 0.9806 | 0.9612 | 0.844 | 0.9033 | 0.9017 | 5.0833 |
+ | 0.0214 | 91.0 | 1092 | 0.9805 | 0.9612 | 0.844 | 0.9033 | 0.9017 | 5.0833 |
+ | 0.0214 | 92.0 | 1104 | 0.9796 | 0.9612 | 0.844 | 0.9033 | 0.9017 | 5.0833 |
+ | 0.0214 | 93.0 | 1116 | 0.9786 | 0.9612 | 0.844 | 0.9033 | 0.9017 | 5.0833 |
+ | 0.0214 | 94.0 | 1128 | 0.9785 | 0.9612 | 0.844 | 0.9033 | 0.9017 | 5.0833 |
+ | 0.0214 | 95.0 | 1140 | 0.9793 | 0.9612 | 0.844 | 0.9033 | 0.9017 | 5.0833 |
+ | 0.0214 | 96.0 | 1152 | 0.9773 | 0.9612 | 0.844 | 0.9033 | 0.9017 | 5.0833 |
+ | 0.0214 | 97.0 | 1164 | 0.9767 | 0.9612 | 0.844 | 0.9033 | 0.9017 | 5.0833 |
+ | 0.0214 | 98.0 | 1176 | 0.9762 | 0.9612 | 0.844 | 0.9033 | 0.9017 | 5.0833 |
+ | 0.0214 | 99.0 | 1188 | 0.9765 | 0.9612 | 0.844 | 0.9033 | 0.9017 | 5.0833 |
+ | 0.0214 | 100.0 | 1200 | 0.9763 | 0.9612 | 0.844 | 0.9033 | 0.9017 | 5.0833 |
+
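+ The ROUGE columns are fractions in [0, 1], not percentages. A minimal sketch of how such scores can be computed with the `evaluate` library (the exact metric code used during training is not part of this commit):
+
+ ```python
+ import evaluate
+
+ rouge = evaluate.load("rouge")
+ scores = rouge.compute(
+     predictions=["a generated summary"],
+     references=["the reference summary"],
+     use_stemmer=True,  # common choice in summarization examples
+ )
+ print(scores)  # keys: rouge1, rouge2, rougeL, rougeLsum
+ ```
+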
+ ### Framework versions
+
+ - Transformers 4.41.2
+ - PyTorch 2.3.0+cu121
+ - Datasets 2.20.0
+ - Tokenizers 0.19.1
generation_config.json ADDED
@@ -0,0 +1,6 @@
+ {
+   "decoder_start_token_id": 0,
+   "eos_token_id": 1,
+   "pad_token_id": 0,
+   "transformers_version": "4.41.2"
+ }
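
These fields are picked up automatically by `generate()`. A sketch for inspecting them (assumes the same hypothetical repo id as in the README sketch above):

```python
from transformers import GenerationConfig

# Loads the generation_config.json shown above from the Hub.
cfg = GenerationConfig.from_pretrained("limaatulya/my_awesome_billsum_model_64")
print(cfg.decoder_start_token_id, cfg.eos_token_id, cfg.pad_token_id)
```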
model.safetensors CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:cdd9d0c545e9f3324b6fd083a62ced308bfacfdc2c4266744865ed8cce91cd5b
+ oid sha256:95f08eb4504452c2c012764044a04c7dce31e3d77784c3dac48df745e0abc87d
  size 242041896
runs/Jun22_09-07-21_0de1f947a67a/events.out.tfevents.1719047243.0de1f947a67a.1794.1 CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:92601d2637e921e64e0042e2f7ccc31de325183e23d2c0ad390c028aab5e91aa
- size 49862
+ oid sha256:2108bb765029fa52827bbd276b0909fedfe6eb4b757b2b02db41806796c33108
+ size 59141