bytesizedllm committed on
Commit 4d9175f
1 Parent(s): 94c69f3

Model save
README.md CHANGED
@@ -15,7 +15,7 @@ should probably proofread and complete it, then remove this comment. -->
 
 This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on the None dataset.
 It achieves the following results on the evaluation set:
-- Loss: 0.0785
+- Loss: 0.0737
 
 ## Model description
 
@@ -34,7 +34,7 @@ More information needed
 ### Training hyperparameters
 
 The following hyperparameters were used during training:
-- learning_rate: 3e-05
+- learning_rate: 2e-05
 - train_batch_size: 16
 - eval_batch_size: 16
 - seed: 42
@@ -46,106 +46,106 @@ The following hyperparameters were used during training:
 
 | Training Loss | Epoch | Step | Validation Loss |
 |:-------------:|:-----:|:----:|:---------------:|
-| No log | 1.0 | 14 | 0.1091 |
-| No log | 2.0 | 28 | 0.1130 |
-| No log | 3.0 | 42 | 0.1122 |
-| No log | 4.0 | 56 | 0.1073 |
-| No log | 5.0 | 70 | 0.0929 |
-| No log | 6.0 | 84 | 0.0910 |
-| No log | 7.0 | 98 | 0.0926 |
-| No log | 8.0 | 112 | 0.1022 |
-| No log | 9.0 | 126 | 0.0937 |
-| No log | 10.0 | 140 | 0.0975 |
-| No log | 11.0 | 154 | 0.0950 |
-| No log | 12.0 | 168 | 0.1064 |
-| No log | 13.0 | 182 | 0.1137 |
-| No log | 14.0 | 196 | 0.0951 |
-| No log | 15.0 | 210 | 0.1074 |
-| No log | 16.0 | 224 | 0.1007 |
-| No log | 17.0 | 238 | 0.0919 |
-| No log | 18.0 | 252 | 0.0859 |
-| No log | 19.0 | 266 | 0.1020 |
-| No log | 20.0 | 280 | 0.0830 |
-| No log | 21.0 | 294 | 0.0839 |
-| No log | 22.0 | 308 | 0.0834 |
-| No log | 23.0 | 322 | 0.0824 |
-| No log | 24.0 | 336 | 0.0837 |
-| No log | 25.0 | 350 | 0.0915 |
-| No log | 26.0 | 364 | 0.0918 |
-| No log | 27.0 | 378 | 0.0827 |
-| No log | 28.0 | 392 | 0.0824 |
-| No log | 29.0 | 406 | 0.0816 |
-| No log | 30.0 | 420 | 0.0904 |
-| No log | 31.0 | 434 | 0.0872 |
-| No log | 32.0 | 448 | 0.0810 |
-| No log | 33.0 | 462 | 0.0817 |
-| No log | 34.0 | 476 | 0.0841 |
-| No log | 35.0 | 490 | 0.0826 |
-| 0.1061 | 36.0 | 504 | 0.0847 |
-| 0.1061 | 37.0 | 518 | 0.0830 |
-| 0.1061 | 38.0 | 532 | 0.0817 |
-| 0.1061 | 39.0 | 546 | 0.0833 |
-| 0.1061 | 40.0 | 560 | 0.0810 |
-| 0.1061 | 41.0 | 574 | 0.0859 |
-| 0.1061 | 42.0 | 588 | 0.0811 |
-| 0.1061 | 43.0 | 602 | 0.0802 |
-| 0.1061 | 44.0 | 616 | 0.0807 |
-| 0.1061 | 45.0 | 630 | 0.0806 |
-| 0.1061 | 46.0 | 644 | 0.0809 |
-| 0.1061 | 47.0 | 658 | 0.0800 |
-| 0.1061 | 48.0 | 672 | 0.0793 |
-| 0.1061 | 49.0 | 686 | 0.0801 |
-| 0.1061 | 50.0 | 700 | 0.0794 |
-| 0.1061 | 51.0 | 714 | 0.0836 |
-| 0.1061 | 52.0 | 728 | 0.0813 |
-| 0.1061 | 53.0 | 742 | 0.0803 |
-| 0.1061 | 54.0 | 756 | 0.0791 |
-| 0.1061 | 55.0 | 770 | 0.0798 |
-| 0.1061 | 56.0 | 784 | 0.0811 |
-| 0.1061 | 57.0 | 798 | 0.0811 |
-| 0.1061 | 58.0 | 812 | 0.0801 |
-| 0.1061 | 59.0 | 826 | 0.0800 |
-| 0.1061 | 60.0 | 840 | 0.0795 |
-| 0.1061 | 61.0 | 854 | 0.0796 |
-| 0.1061 | 62.0 | 868 | 0.0796 |
-| 0.1061 | 63.0 | 882 | 0.0799 |
-| 0.1061 | 64.0 | 896 | 0.0793 |
-| 0.1061 | 65.0 | 910 | 0.0791 |
-| 0.1061 | 66.0 | 924 | 0.0790 |
-| 0.1061 | 67.0 | 938 | 0.0790 |
-| 0.1061 | 68.0 | 952 | 0.0789 |
-| 0.1061 | 69.0 | 966 | 0.0790 |
-| 0.1061 | 70.0 | 980 | 0.0790 |
-| 0.1061 | 71.0 | 994 | 0.0789 |
-| 0.088 | 72.0 | 1008 | 0.0789 |
-| 0.088 | 73.0 | 1022 | 0.0789 |
-| 0.088 | 74.0 | 1036 | 0.0788 |
-| 0.088 | 75.0 | 1050 | 0.0788 |
-| 0.088 | 76.0 | 1064 | 0.0788 |
-| 0.088 | 77.0 | 1078 | 0.0787 |
-| 0.088 | 78.0 | 1092 | 0.0787 |
-| 0.088 | 79.0 | 1106 | 0.0787 |
-| 0.088 | 80.0 | 1120 | 0.0786 |
-| 0.088 | 81.0 | 1134 | 0.0787 |
-| 0.088 | 82.0 | 1148 | 0.0790 |
-| 0.088 | 83.0 | 1162 | 0.0787 |
-| 0.088 | 84.0 | 1176 | 0.0787 |
-| 0.088 | 85.0 | 1190 | 0.0787 |
-| 0.088 | 86.0 | 1204 | 0.0787 |
-| 0.088 | 87.0 | 1218 | 0.0789 |
-| 0.088 | 88.0 | 1232 | 0.0789 |
-| 0.088 | 89.0 | 1246 | 0.0789 |
-| 0.088 | 90.0 | 1260 | 0.0788 |
-| 0.088 | 91.0 | 1274 | 0.0788 |
-| 0.088 | 92.0 | 1288 | 0.0786 |
-| 0.088 | 93.0 | 1302 | 0.0786 |
-| 0.088 | 94.0 | 1316 | 0.0785 |
-| 0.088 | 95.0 | 1330 | 0.0785 |
-| 0.088 | 96.0 | 1344 | 0.0785 |
-| 0.088 | 97.0 | 1358 | 0.0785 |
-| 0.088 | 98.0 | 1372 | 0.0785 |
-| 0.088 | 99.0 | 1386 | 0.0785 |
-| 0.088 | 100.0 | 1400 | 0.0785 |
+| No log | 1.0 | 14 | 3.6840 |
+| No log | 2.0 | 28 | 1.5996 |
+| No log | 3.0 | 42 | 0.9961 |
+| No log | 4.0 | 56 | 0.7927 |
+| No log | 5.0 | 70 | 0.6597 |
+| No log | 6.0 | 84 | 0.5352 |
+| No log | 7.0 | 98 | 0.4412 |
+| No log | 8.0 | 112 | 0.3435 |
+| No log | 9.0 | 126 | 0.2955 |
+| No log | 10.0 | 140 | 0.2741 |
+| No log | 11.0 | 154 | 0.2211 |
+| No log | 12.0 | 168 | 0.1959 |
+| No log | 13.0 | 182 | 0.1783 |
+| No log | 14.0 | 196 | 0.1919 |
+| No log | 15.0 | 210 | 0.1640 |
+| No log | 16.0 | 224 | 0.1439 |
+| No log | 17.0 | 238 | 0.1479 |
+| No log | 18.0 | 252 | 0.1536 |
+| No log | 19.0 | 266 | 0.1365 |
+| No log | 20.0 | 280 | 0.1444 |
+| No log | 21.0 | 294 | 0.1268 |
+| No log | 22.0 | 308 | 0.1330 |
+| No log | 23.0 | 322 | 0.1192 |
+| No log | 24.0 | 336 | 0.1254 |
+| No log | 25.0 | 350 | 0.1168 |
+| No log | 26.0 | 364 | 0.1099 |
+| No log | 27.0 | 378 | 0.1077 |
+| No log | 28.0 | 392 | 0.1134 |
+| No log | 29.0 | 406 | 0.1039 |
+| No log | 30.0 | 420 | 0.1293 |
+| No log | 31.0 | 434 | 0.1211 |
+| No log | 32.0 | 448 | 0.0997 |
+| No log | 33.0 | 462 | 0.1052 |
+| No log | 34.0 | 476 | 0.1067 |
+| No log | 35.0 | 490 | 0.0974 |
+| 0.5014 | 36.0 | 504 | 0.0987 |
+| 0.5014 | 37.0 | 518 | 0.0955 |
+| 0.5014 | 38.0 | 532 | 0.0938 |
+| 0.5014 | 39.0 | 546 | 0.0894 |
+| 0.5014 | 40.0 | 560 | 0.0873 |
+| 0.5014 | 41.0 | 574 | 0.0943 |
+| 0.5014 | 42.0 | 588 | 0.0917 |
+| 0.5014 | 43.0 | 602 | 0.0869 |
+| 0.5014 | 44.0 | 616 | 0.0896 |
+| 0.5014 | 45.0 | 630 | 0.0857 |
+| 0.5014 | 46.0 | 644 | 0.0889 |
+| 0.5014 | 47.0 | 658 | 0.0854 |
+| 0.5014 | 48.0 | 672 | 0.0896 |
+| 0.5014 | 49.0 | 686 | 0.0848 |
+| 0.5014 | 50.0 | 700 | 0.0882 |
+| 0.5014 | 51.0 | 714 | 0.0840 |
+| 0.5014 | 52.0 | 728 | 0.0826 |
+| 0.5014 | 53.0 | 742 | 0.0843 |
+| 0.5014 | 54.0 | 756 | 0.0823 |
+| 0.5014 | 55.0 | 770 | 0.0805 |
+| 0.5014 | 56.0 | 784 | 0.0799 |
+| 0.5014 | 57.0 | 798 | 0.0776 |
+| 0.5014 | 58.0 | 812 | 0.0775 |
+| 0.5014 | 59.0 | 826 | 0.0776 |
+| 0.5014 | 60.0 | 840 | 0.0761 |
+| 0.5014 | 61.0 | 854 | 0.0756 |
+| 0.5014 | 62.0 | 868 | 0.0764 |
+| 0.5014 | 63.0 | 882 | 0.0768 |
+| 0.5014 | 64.0 | 896 | 0.0764 |
+| 0.5014 | 65.0 | 910 | 0.0770 |
+| 0.5014 | 66.0 | 924 | 0.0766 |
+| 0.5014 | 67.0 | 938 | 0.0776 |
+| 0.5014 | 68.0 | 952 | 0.0752 |
+| 0.5014 | 69.0 | 966 | 0.0762 |
+| 0.5014 | 70.0 | 980 | 0.0764 |
+| 0.5014 | 71.0 | 994 | 0.0747 |
+| 0.0961 | 72.0 | 1008 | 0.0762 |
+| 0.0961 | 73.0 | 1022 | 0.0767 |
+| 0.0961 | 74.0 | 1036 | 0.0766 |
+| 0.0961 | 75.0 | 1050 | 0.0767 |
+| 0.0961 | 76.0 | 1064 | 0.0755 |
+| 0.0961 | 77.0 | 1078 | 0.0755 |
+| 0.0961 | 78.0 | 1092 | 0.0751 |
+| 0.0961 | 79.0 | 1106 | 0.0747 |
+| 0.0961 | 80.0 | 1120 | 0.0756 |
+| 0.0961 | 81.0 | 1134 | 0.0752 |
+| 0.0961 | 82.0 | 1148 | 0.0751 |
+| 0.0961 | 83.0 | 1162 | 0.0749 |
+| 0.0961 | 84.0 | 1176 | 0.0748 |
+| 0.0961 | 85.0 | 1190 | 0.0744 |
+| 0.0961 | 86.0 | 1204 | 0.0742 |
+| 0.0961 | 87.0 | 1218 | 0.0747 |
+| 0.0961 | 88.0 | 1232 | 0.0745 |
+| 0.0961 | 89.0 | 1246 | 0.0739 |
+| 0.0961 | 90.0 | 1260 | 0.0738 |
+| 0.0961 | 91.0 | 1274 | 0.0739 |
+| 0.0961 | 92.0 | 1288 | 0.0740 |
+| 0.0961 | 93.0 | 1302 | 0.0738 |
+| 0.0961 | 94.0 | 1316 | 0.0738 |
+| 0.0961 | 95.0 | 1330 | 0.0737 |
+| 0.0961 | 96.0 | 1344 | 0.0736 |
+| 0.0961 | 97.0 | 1358 | 0.0737 |
+| 0.0961 | 98.0 | 1372 | 0.0737 |
+| 0.0961 | 99.0 | 1386 | 0.0737 |
+| 0.0961 | 100.0 | 1400 | 0.0737 |
 
 
 ### Framework versions
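For readers who want to reproduce the setup, the hyperparameters above map directly onto a Transformers TrainingArguments object. The sketch below is an assumption, not the author's actual training script: the output directory and per-epoch evaluation strategy are guesses, while the 100 epochs are implied by the results table. The "No log" entries in the training-loss column simply mean no training loss had been logged yet; the Trainer logs every 500 steps by default, which matches the first value appearing in the epoch-36 row (step 504).

```python
# Minimal sketch (assumed, not the author's script) of a TrainingArguments
# configuration matching the hyperparameters listed in the card.
from transformers import TrainingArguments

args = TrainingArguments(
    output_dir="distilbert-finetune",  # hypothetical output path
    learning_rate=2e-05,               # from the card (was 3e-05 before this commit)
    per_device_train_batch_size=16,    # train_batch_size: 16
    per_device_eval_batch_size=16,     # eval_batch_size: 16
    seed=42,                           # seed: 42
    num_train_epochs=100,              # implied by the table (epochs 1.0 through 100.0)
    evaluation_strategy="epoch",       # assumed; renamed to eval_strategy in newer releases
)
```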
model.safetensors CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:d80aea9b366f54daee8bc85753f052d0ecc14ed1758ec3b6c782580b77faa4c6
+oid sha256:67b0a7664370109109ab3c38df110d7370c3d75baa7d7a058a9c2d62381c4352
 size 265470032
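The change above is a Git LFS pointer update: the repository tracks only the payload's SHA-256 (`oid`) and byte size, so a new `oid` with an unchanged `size` means the weights were overwritten in place. A minimal sketch, assuming the file has been downloaded locally, for checking a copy against the new pointer:

```python
# Minimal sketch: verify a downloaded model.safetensors against the Git LFS
# pointer above, which records the payload's SHA-256 (oid) and size in bytes.
import hashlib
from pathlib import Path

EXPECTED_OID = "67b0a7664370109109ab3c38df110d7370c3d75baa7d7a058a9c2d62381c4352"
EXPECTED_SIZE = 265470032

path = Path("model.safetensors")  # assumed local download path
assert path.stat().st_size == EXPECTED_SIZE, "size mismatch"

sha = hashlib.sha256()
with path.open("rb") as f:
    for chunk in iter(lambda: f.read(1 << 20), b""):  # hash in 1 MiB chunks
        sha.update(chunk)
assert sha.hexdigest() == EXPECTED_OID, "hash mismatch"
print("model.safetensors matches the LFS pointer")
```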
runs/Apr15_23-27-20_asrlytics-server/events.out.tfevents.1713203845.asrlytics-server.42534.0 CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:2a77e1a34ff0e1098129fd947e90393ad41bd03a43e184a2fcb61ea3c617a489
-size 14392
+oid sha256:e566a1f1d3a4ebac82923040c7aa6b05efa92d3a75547baa589a7965e654577e
+size 32301
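The tfevents file above is the TensorBoard log behind the results table; it grew from 14392 to 32301 bytes as more steps were logged. A minimal sketch for reading the curves back, assuming TensorBoard is installed; `eval/loss` is the tag the Hugging Face Trainer typically writes, so check `Tags()` if it differs:

```python
# Minimal sketch: recover the logged loss curves from the run directory.
from tensorboard.backend.event_processing.event_accumulator import EventAccumulator

ea = EventAccumulator("runs/Apr15_23-27-20_asrlytics-server")
ea.Reload()                            # parse every events.out.tfevents.* file found
print(ea.Tags()["scalars"])            # list the scalar tags actually present
for event in ea.Scalars("eval/loss"):  # assumed tag name; verify via Tags()
    print(event.step, event.value)
```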