ombarki345 committed
Commit 00aef93 (1 parent: d810b80)

End of training
README.md CHANGED
@@ -17,11 +17,11 @@ should probably proofread and complete it, then remove this comment. -->
 
 This model is a fine-tuned version of [t5-small](https://huggingface.co/t5-small) on the None dataset.
 It achieves the following results on the evaluation set:
- - Loss: 1.7745
- - Rouge1: 0.0903
- - Rouge2: 0.0432
- - Rougel: 0.0865
- - Rougelsum: 0.0865
+ - Loss: 1.8717
+ - Rouge1: 0.1003
+ - Rouge2: 0.0529
+ - Rougel: 0.0944
+ - Rougelsum: 0.0946
 - Gen Len: 19.0
 
 ## Model description
@@ -54,56 +54,56 @@ The following hyperparameters were used during training:
 
 | Training Loss | Epoch | Step | Validation Loss | Rouge1 | Rouge2 | Rougel | Rougelsum | Gen Len |
 |:-------------:|:-----:|:----:|:---------------:|:------:|:------:|:------:|:---------:|:-------:|
- | No log | 1.0 | 15 | 4.4300 | 0.058 | 0.0211 | 0.0529 | 0.0536 | 19.0 |
- | No log | 2.0 | 30 | 3.7269 | 0.0757 | 0.0285 | 0.0678 | 0.0681 | 19.0 |
- | No log | 3.0 | 45 | 3.4625 | 0.0752 | 0.0347 | 0.0678 | 0.0673 | 19.0 |
- | No log | 4.0 | 60 | 3.2883 | 0.0809 | 0.0421 | 0.0742 | 0.0734 | 19.0 |
- | No log | 5.0 | 75 | 3.1337 | 0.0879 | 0.0444 | 0.0808 | 0.0807 | 19.0 |
- | No log | 6.0 | 90 | 2.9948 | 0.0893 | 0.0482 | 0.0822 | 0.0821 | 19.0 |
- | No log | 7.0 | 105 | 2.8729 | 0.0939 | 0.0528 | 0.0867 | 0.0864 | 19.0 |
- | No log | 8.0 | 120 | 2.7754 | 0.099 | 0.0547 | 0.0914 | 0.091 | 19.0 |
- | No log | 9.0 | 135 | 2.6912 | 0.0984 | 0.0551 | 0.0892 | 0.0891 | 19.0 |
- | No log | 10.0 | 150 | 2.6193 | 0.0968 | 0.055 | 0.0878 | 0.0879 | 19.0 |
- | No log | 11.0 | 165 | 2.5559 | 0.0999 | 0.0573 | 0.09 | 0.0902 | 19.0 |
- | No log | 12.0 | 180 | 2.4959 | 0.1008 | 0.0558 | 0.0911 | 0.0914 | 19.0 |
- | No log | 13.0 | 195 | 2.4405 | 0.1026 | 0.0569 | 0.0935 | 0.0936 | 19.0 |
- | No log | 14.0 | 210 | 2.3918 | 0.1034 | 0.0566 | 0.0941 | 0.0944 | 19.0 |
- | No log | 15.0 | 225 | 2.3471 | 0.1013 | 0.0519 | 0.0919 | 0.0921 | 19.0 |
- | No log | 16.0 | 240 | 2.2984 | 0.1075 | 0.0536 | 0.0974 | 0.0976 | 19.0 |
- | No log | 17.0 | 255 | 2.2582 | 0.1046 | 0.051 | 0.0944 | 0.0949 | 19.0 |
- | No log | 18.0 | 270 | 2.2159 | 0.1052 | 0.0515 | 0.094 | 0.0946 | 19.0 |
- | No log | 19.0 | 285 | 2.1784 | 0.0978 | 0.0445 | 0.0881 | 0.0879 | 19.0 |
- | No log | 20.0 | 300 | 2.1363 | 0.0982 | 0.0459 | 0.089 | 0.0887 | 19.0 |
- | No log | 21.0 | 315 | 2.0979 | 0.1024 | 0.0494 | 0.094 | 0.0946 | 19.0 |
- | No log | 22.0 | 330 | 2.0563 | 0.1105 | 0.0556 | 0.1012 | 0.1017 | 19.0 |
- | No log | 23.0 | 345 | 2.0139 | 0.109 | 0.0551 | 0.1 | 0.1005 | 19.0 |
- | No log | 24.0 | 360 | 1.9767 | 0.1092 | 0.0564 | 0.0998 | 0.1002 | 19.0 |
- | No log | 25.0 | 375 | 1.9463 | 0.1099 | 0.0563 | 0.0997 | 0.1009 | 19.0 |
- | No log | 26.0 | 390 | 1.9223 | 0.1024 | 0.0535 | 0.0949 | 0.0956 | 19.0 |
- | No log | 27.0 | 405 | 1.9032 | 0.1039 | 0.0546 | 0.0962 | 0.0968 | 19.0 |
- | No log | 28.0 | 420 | 1.8868 | 0.1053 | 0.0539 | 0.0964 | 0.0964 | 19.0 |
- | No log | 29.0 | 435 | 1.8732 | 0.1004 | 0.0525 | 0.0924 | 0.0927 | 19.0 |
- | No log | 30.0 | 450 | 1.8587 | 0.1004 | 0.0525 | 0.0924 | 0.0927 | 19.0 |
- | No log | 31.0 | 465 | 1.8474 | 0.0965 | 0.0491 | 0.0891 | 0.0892 | 19.0 |
- | No log | 32.0 | 480 | 1.8381 | 0.0942 | 0.0462 | 0.0873 | 0.0873 | 19.0 |
- | No log | 33.0 | 495 | 1.8310 | 0.1001 | 0.051 | 0.0932 | 0.093 | 19.0 |
- | 2.9242 | 34.0 | 510 | 1.8249 | 0.093 | 0.0457 | 0.0875 | 0.0877 | 19.0 |
- | 2.9242 | 35.0 | 525 | 1.8160 | 0.0898 | 0.0426 | 0.0842 | 0.0841 | 19.0 |
- | 2.9242 | 36.0 | 540 | 1.8102 | 0.0892 | 0.042 | 0.0844 | 0.0844 | 19.0 |
- | 2.9242 | 37.0 | 555 | 1.8050 | 0.0906 | 0.0421 | 0.0864 | 0.0863 | 19.0 |
- | 2.9242 | 38.0 | 570 | 1.8010 | 0.0906 | 0.0421 | 0.0864 | 0.0863 | 19.0 |
- | 2.9242 | 39.0 | 585 | 1.7953 | 0.0879 | 0.0386 | 0.0836 | 0.0835 | 19.0 |
- | 2.9242 | 40.0 | 600 | 1.7926 | 0.0906 | 0.0421 | 0.0864 | 0.0863 | 19.0 |
- | 2.9242 | 41.0 | 615 | 1.7886 | 0.0888 | 0.0421 | 0.0851 | 0.0848 | 19.0 |
- | 2.9242 | 42.0 | 630 | 1.7850 | 0.0894 | 0.0421 | 0.0857 | 0.0854 | 19.0 |
- | 2.9242 | 43.0 | 645 | 1.7823 | 0.0894 | 0.0421 | 0.0857 | 0.0854 | 19.0 |
- | 2.9242 | 44.0 | 660 | 1.7805 | 0.0889 | 0.0426 | 0.0851 | 0.0849 | 19.0 |
- | 2.9242 | 45.0 | 675 | 1.7790 | 0.0889 | 0.0426 | 0.0851 | 0.0849 | 19.0 |
- | 2.9242 | 46.0 | 690 | 1.7772 | 0.0903 | 0.0432 | 0.0865 | 0.0865 | 19.0 |
- | 2.9242 | 47.0 | 705 | 1.7759 | 0.0903 | 0.0432 | 0.0865 | 0.0865 | 19.0 |
- | 2.9242 | 48.0 | 720 | 1.7752 | 0.0903 | 0.0432 | 0.0865 | 0.0865 | 19.0 |
- | 2.9242 | 49.0 | 735 | 1.7747 | 0.0903 | 0.0432 | 0.0865 | 0.0865 | 19.0 |
- | 2.9242 | 50.0 | 750 | 1.7745 | 0.0903 | 0.0432 | 0.0865 | 0.0865 | 19.0 |
+ | No log | 1.0 | 15 | 4.4926 | 0.0669 | 0.0329 | 0.0614 | 0.0607 | 19.0 |
+ | No log | 2.0 | 30 | 3.7442 | 0.0768 | 0.0345 | 0.0697 | 0.0692 | 19.0 |
+ | No log | 3.0 | 45 | 3.4920 | 0.0818 | 0.0375 | 0.072 | 0.0717 | 19.0 |
+ | No log | 4.0 | 60 | 3.3214 | 0.0928 | 0.0506 | 0.0841 | 0.0842 | 19.0 |
+ | No log | 5.0 | 75 | 3.1638 | 0.0954 | 0.0517 | 0.0868 | 0.0867 | 19.0 |
+ | No log | 6.0 | 90 | 3.0284 | 0.105 | 0.0598 | 0.0964 | 0.0961 | 19.0 |
+ | No log | 7.0 | 105 | 2.9106 | 0.1039 | 0.0587 | 0.0956 | 0.0951 | 19.0 |
+ | No log | 8.0 | 120 | 2.8210 | 0.1029 | 0.0587 | 0.095 | 0.0946 | 19.0 |
+ | No log | 9.0 | 135 | 2.7338 | 0.108 | 0.0655 | 0.1016 | 0.1008 | 19.0 |
+ | No log | 10.0 | 150 | 2.6612 | 0.1078 | 0.0656 | 0.1031 | 0.1031 | 19.0 |
+ | No log | 11.0 | 165 | 2.5973 | 0.1088 | 0.066 | 0.1038 | 0.1043 | 19.0 |
+ | No log | 12.0 | 180 | 2.5455 | 0.112 | 0.0676 | 0.1072 | 0.1074 | 19.0 |
+ | No log | 13.0 | 195 | 2.4925 | 0.1164 | 0.0694 | 0.1101 | 0.1107 | 19.0 |
+ | No log | 14.0 | 210 | 2.4458 | 0.1151 | 0.0696 | 0.1095 | 0.1101 | 19.0 |
+ | No log | 15.0 | 225 | 2.4013 | 0.1107 | 0.066 | 0.105 | 0.1058 | 19.0 |
+ | No log | 16.0 | 240 | 2.3636 | 0.1136 | 0.0668 | 0.1084 | 0.1089 | 19.0 |
+ | No log | 17.0 | 255 | 2.3236 | 0.1144 | 0.0654 | 0.1079 | 0.1083 | 19.0 |
+ | No log | 18.0 | 270 | 2.2813 | 0.116 | 0.0664 | 0.1104 | 0.1103 | 19.0 |
+ | No log | 19.0 | 285 | 2.2396 | 0.1101 | 0.0618 | 0.104 | 0.1047 | 19.0 |
+ | No log | 20.0 | 300 | 2.1998 | 0.1109 | 0.0626 | 0.1048 | 0.1055 | 19.0 |
+ | No log | 21.0 | 315 | 2.1595 | 0.1121 | 0.062 | 0.1049 | 0.1059 | 19.0 |
+ | No log | 22.0 | 330 | 2.1235 | 0.1111 | 0.0613 | 0.1043 | 0.1053 | 19.0 |
+ | No log | 23.0 | 345 | 2.0830 | 0.1154 | 0.0631 | 0.1099 | 0.1109 | 19.0 |
+ | No log | 24.0 | 360 | 2.0472 | 0.1094 | 0.0588 | 0.1037 | 0.1043 | 19.0 |
+ | No log | 25.0 | 375 | 2.0207 | 0.1103 | 0.0591 | 0.1053 | 0.1062 | 19.0 |
+ | No log | 26.0 | 390 | 1.9972 | 0.1037 | 0.0546 | 0.0999 | 0.1 | 19.0 |
+ | No log | 27.0 | 405 | 1.9808 | 0.1059 | 0.0531 | 0.099 | 0.0995 | 19.0 |
+ | No log | 28.0 | 420 | 1.9684 | 0.1074 | 0.0557 | 0.0998 | 0.1 | 19.0 |
+ | No log | 29.0 | 435 | 1.9560 | 0.1076 | 0.0563 | 0.0999 | 0.1001 | 19.0 |
+ | No log | 30.0 | 450 | 1.9452 | 0.1069 | 0.054 | 0.0991 | 0.0994 | 19.0 |
+ | No log | 31.0 | 465 | 1.9346 | 0.1042 | 0.0531 | 0.0968 | 0.097 | 19.0 |
+ | No log | 32.0 | 480 | 1.9259 | 0.1064 | 0.0558 | 0.0997 | 0.1 | 19.0 |
+ | No log | 33.0 | 495 | 1.9198 | 0.1056 | 0.0559 | 0.1003 | 0.1005 | 19.0 |
+ | 2.8916 | 34.0 | 510 | 1.9138 | 0.1052 | 0.0559 | 0.0994 | 0.0996 | 19.0 |
+ | 2.8916 | 35.0 | 525 | 1.9080 | 0.1012 | 0.0523 | 0.0954 | 0.0958 | 19.0 |
+ | 2.8916 | 36.0 | 540 | 1.9006 | 0.101 | 0.0532 | 0.0957 | 0.0961 | 19.0 |
+ | 2.8916 | 37.0 | 555 | 1.8964 | 0.0999 | 0.0528 | 0.0944 | 0.0947 | 19.0 |
+ | 2.8916 | 38.0 | 570 | 1.8920 | 0.1003 | 0.0534 | 0.0948 | 0.0951 | 19.0 |
+ | 2.8916 | 39.0 | 585 | 1.8884 | 0.1003 | 0.0534 | 0.0948 | 0.0951 | 19.0 |
+ | 2.8916 | 40.0 | 600 | 1.8856 | 0.1003 | 0.0534 | 0.0948 | 0.0952 | 19.0 |
+ | 2.8916 | 41.0 | 615 | 1.8827 | 0.1002 | 0.0528 | 0.0943 | 0.0945 | 19.0 |
+ | 2.8916 | 42.0 | 630 | 1.8808 | 0.1016 | 0.0528 | 0.095 | 0.0955 | 19.0 |
+ | 2.8916 | 43.0 | 645 | 1.8790 | 0.1016 | 0.0528 | 0.095 | 0.0956 | 19.0 |
+ | 2.8916 | 44.0 | 660 | 1.8767 | 0.1016 | 0.0528 | 0.095 | 0.0956 | 19.0 |
+ | 2.8916 | 45.0 | 675 | 1.8753 | 0.1016 | 0.0528 | 0.0951 | 0.0956 | 19.0 |
+ | 2.8916 | 46.0 | 690 | 1.8738 | 0.1016 | 0.0528 | 0.0951 | 0.0956 | 19.0 |
+ | 2.8916 | 47.0 | 705 | 1.8729 | 0.1016 | 0.0528 | 0.0951 | 0.0956 | 19.0 |
+ | 2.8916 | 48.0 | 720 | 1.8724 | 0.1003 | 0.0529 | 0.0944 | 0.0946 | 19.0 |
+ | 2.8916 | 49.0 | 735 | 1.8718 | 0.1003 | 0.0529 | 0.0944 | 0.0946 | 19.0 |
+ | 2.8916 | 50.0 | 750 | 1.8717 | 0.1003 | 0.0529 | 0.0944 | 0.0946 | 19.0 |
 
 
 ### Framework versions
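The Rouge1/Rouge2/Rougel columns in the card above are n-gram and longest-common-subsequence overlap F-measures (the card itself is auto-generated by the `transformers` Trainer, which typically computes them via the `evaluate`/`rouge_score` packages). As a rough illustration of what a Rouge1 score of ~0.10 means, here is a minimal self-contained sketch of a unigram-overlap ROUGE-1 F-measure; it uses plain whitespace tokenization rather than the stemming and normalization of the real `rouge_score` implementation, so it is an approximation only:

```python
from collections import Counter

def rouge1_f(reference: str, candidate: str) -> float:
    """Simplified ROUGE-1 F-measure: unigram overlap between two strings.

    Tokens are lowercased whitespace splits; overlap is counted with
    multiplicity via a Counter intersection.
    """
    ref = Counter(reference.lower().split())
    cand = Counter(candidate.lower().split())
    overlap = sum((ref & cand).values())  # matched unigrams, with multiplicity
    if overlap == 0:
        return 0.0
    precision = overlap / sum(cand.values())
    recall = overlap / sum(ref.values())
    return 2 * precision * recall / (precision + recall)

# Example: 3 of 3 candidate tokens match (precision 1.0), 3 of 6 reference
# tokens are covered (recall 0.5), giving F = 2/3.
print(rouge1_f("the cat sat on the mat", "the cat sat"))
```

A score around 0.10, as in the table, means generated and reference summaries share only about one word in ten under this kind of overlap measure.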
model.safetensors CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:3e3a59f13765c6bdb2d5fefee9ca3cadf956bc85cd294438f6b0b4a41a511437
+oid sha256:866805ff9ecb5b79952c40c0ae98cf6b9dd3b2b7d805f6a059b047ad03089321
 size 242041896
runs/Mar26_14-16-06_32da3d74d5a6/events.out.tfevents.1711462567.32da3d74d5a6.34.0 CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:872c958b923ebbd8a7a699207239afa9c29a2f04db2cfa8921126f37f511877e
-size 23038
+oid sha256:de66373f9a792b3bb38701c40a8a541c8c93530f7ff6bd7831289f4104feb133
+size 32317