denizzhansahin committed

Commit 473440b
1 Parent(s): 58b77c9

Upload model

Files changed (1):
  1. README.md +102 -12
README.md CHANGED
@@ -15,9 +15,9 @@ probably proofread and complete it, then remove this comment. -->
 
 This model is a fine-tuned version of [gpt2](https://huggingface.co/gpt2) on an unknown dataset.
 It achieves the following results on the evaluation set:
-- Train Loss: 0.6335
+- Train Loss: 0.6315
 - Validation Loss: 7.0703
-- Epoch: 9
+- Epoch: 99
 
 ## Model description
 
@@ -43,16 +43,106 @@ The following hyperparameters were used during training:
 
 | Train Loss | Validation Loss | Epoch |
 |:----------:|:---------------:|:-----:|
-| 0.6276 | 7.0703 | 0 |
-| 0.6363 | 7.0703 | 1 |
-| 0.6332 | 7.0703 | 2 |
-| 0.6360 | 7.0703 | 3 |
-| 0.6352 | 7.0703 | 4 |
-| 0.6372 | 7.0703 | 5 |
-| 0.6317 | 7.0703 | 6 |
-| 0.6348 | 7.0703 | 7 |
-| 0.6285 | 7.0703 | 8 |
-| 0.6335 | 7.0703 | 9 |
+| 0.6331 | 7.0703 | 0 |
+| 0.6319 | 7.0703 | 1 |
+| 0.6283 | 7.0703 | 2 |
+| 0.6276 | 7.0703 | 3 |
+| 0.6295 | 7.0703 | 4 |
+| 0.6356 | 7.0703 | 5 |
+| 0.6282 | 7.0703 | 6 |
+| 0.6287 | 7.0703 | 7 |
+| 0.6309 | 7.0703 | 8 |
+| 0.6291 | 7.0703 | 9 |
+| 0.6320 | 7.0703 | 10 |
+| 0.6284 | 7.0703 | 11 |
+| 0.6333 | 7.0703 | 12 |
+| 0.6302 | 7.0703 | 13 |
+| 0.6346 | 7.0703 | 14 |
+| 0.6285 | 7.0703 | 15 |
+| 0.6248 | 7.0703 | 16 |
+| 0.6317 | 7.0703 | 17 |
+| 0.6291 | 7.0703 | 18 |
+| 0.6305 | 7.0703 | 19 |
+| 0.6321 | 7.0703 | 20 |
+| 0.6317 | 7.0703 | 21 |
+| 0.6274 | 7.0703 | 22 |
+| 0.6283 | 7.0703 | 23 |
+| 0.6359 | 7.0703 | 24 |
+| 0.6334 | 7.0703 | 25 |
+| 0.6306 | 7.0703 | 26 |
+| 0.6375 | 7.0703 | 27 |
+| 0.6267 | 7.0703 | 28 |
+| 0.6349 | 7.0703 | 29 |
+| 0.6298 | 7.0703 | 30 |
+| 0.6314 | 7.0703 | 31 |
+| 0.6347 | 7.0703 | 32 |
+| 0.6284 | 7.0703 | 33 |
+| 0.6300 | 7.0703 | 34 |
+| 0.6287 | 7.0703 | 35 |
+| 0.6337 | 7.0703 | 36 |
+| 0.6348 | 7.0703 | 37 |
+| 0.6297 | 7.0703 | 38 |
+| 0.6376 | 7.0703 | 39 |
+| 0.6340 | 7.0703 | 40 |
+| 0.6311 | 7.0703 | 41 |
+| 0.6327 | 7.0703 | 42 |
+| 0.6343 | 7.0703 | 43 |
+| 0.6297 | 7.0703 | 44 |
+| 0.6316 | 7.0703 | 45 |
+| 0.6302 | 7.0703 | 46 |
+| 0.6324 | 7.0703 | 47 |
+| 0.6355 | 7.0703 | 48 |
+| 0.6278 | 7.0703 | 49 |
+| 0.6324 | 7.0703 | 50 |
+| 0.6332 | 7.0703 | 51 |
+| 0.6294 | 7.0703 | 52 |
+| 0.6348 | 7.0703 | 53 |
+| 0.6288 | 7.0703 | 54 |
+| 0.6332 | 7.0703 | 55 |
+| 0.6334 | 7.0703 | 56 |
+| 0.6302 | 7.0703 | 57 |
+| 0.6287 | 7.0703 | 58 |
+| 0.6274 | 7.0703 | 59 |
+| 0.6272 | 7.0703 | 60 |
+| 0.6264 | 7.0703 | 61 |
+| 0.6298 | 7.0703 | 62 |
+| 0.6275 | 7.0703 | 63 |
+| 0.6315 | 7.0703 | 64 |
+| 0.6293 | 7.0703 | 65 |
+| 0.6325 | 7.0703 | 66 |
+| 0.6277 | 7.0703 | 67 |
+| 0.6292 | 7.0703 | 68 |
+| 0.6254 | 7.0703 | 69 |
+| 0.6351 | 7.0703 | 70 |
+| 0.6362 | 7.0703 | 71 |
+| 0.6312 | 7.0703 | 72 |
+| 0.6307 | 7.0703 | 73 |
+| 0.6260 | 7.0703 | 74 |
+| 0.6289 | 7.0703 | 75 |
+| 0.6333 | 7.0703 | 76 |
+| 0.6259 | 7.0703 | 77 |
+| 0.6270 | 7.0703 | 78 |
+| 0.6300 | 7.0703 | 79 |
+| 0.6321 | 7.0703 | 80 |
+| 0.6352 | 7.0703 | 81 |
+| 0.6283 | 7.0703 | 82 |
+| 0.6377 | 7.0703 | 83 |
+| 0.6291 | 7.0703 | 84 |
+| 0.6263 | 7.0703 | 85 |
+| 0.6302 | 7.0703 | 86 |
+| 0.6336 | 7.0703 | 87 |
+| 0.6326 | 7.0703 | 88 |
+| 0.6365 | 7.0703 | 89 |
+| 0.6328 | 7.0703 | 90 |
+| 0.6281 | 7.0703 | 91 |
+| 0.6360 | 7.0703 | 92 |
+| 0.6347 | 7.0703 | 93 |
+| 0.6318 | 7.0703 | 94 |
+| 0.6334 | 7.0703 | 95 |
+| 0.6349 | 7.0703 | 96 |
+| 0.6274 | 7.0703 | 97 |
+| 0.6266 | 7.0703 | 98 |
+| 0.6315 | 7.0703 | 99 |
 
 
 ### Framework versions
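
The losses in the card are cross-entropy values in nats, so they convert to perplexity as exp(loss). A minimal sketch using the final-epoch figures from the table (the variable names are illustrative, not part of the card):

```python
import math

# Final-epoch (epoch 99) loss values copied from the model card.
final_train_loss = 0.6315
final_val_loss = 7.0703

# Cross-entropy in nats -> perplexity via exp(loss).
train_ppl = math.exp(final_train_loss)  # roughly 1.88
val_ppl = math.exp(final_val_loss)      # roughly 1176

print(f"train perplexity: {train_ppl:.2f}")
print(f"validation perplexity: {val_ppl:.1f}")
```

The large gap between the two perplexities mirrors the gap between the train and validation losses reported in the table.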