Joshwabail committed on
Commit
10ed594
1 Parent(s): 9f5cae4

update model card README.md

Files changed (1): README.md +155 -0
README.md ADDED
---
license: mit
tags:
- generated_from_trainer
model-index:
- name: gpt2_finetuned_wolfram
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# gpt2_finetuned_wolfram

This model is a fine-tuned version of [gpt2](https://huggingface.co/gpt2) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 5.2595
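The card does not include a usage example; here is a minimal generation sketch. The repo id `Joshwabail/gpt2_finetuned_wolfram` is inferred from the commit author and model name above, and the Wolfram-Language-style prompt is a guess at the training domain, so treat both as assumptions:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Assumed repo id (commit author + model name); adjust if the model lives elsewhere.
model_id = "Joshwabail/gpt2_finetuned_wolfram"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# Hypothetical Wolfram-Language-style prompt; the card does not document the data format.
inputs = tokenizer("Integrate[x^2, x]", return_tensors="pt")
outputs = model.generate(
    **inputs,
    max_new_tokens=40,
    do_sample=True,
    top_p=0.95,
    pad_token_id=tokenizer.eos_token_id,  # GPT-2 has no pad token
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```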
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a matching `TrainingArguments` sketch follows the list):
- learning_rate: 0.0005
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: cosine
- num_epochs: 100
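For reproduction, a hedged `TrainingArguments` sketch mirroring the list above; `output_dir` and `evaluation_strategy` are placeholders not stated on the card, and the default AdamW optimizer in Transformers 4.26 already uses the reported betas and epsilon:

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="gpt2_finetuned_wolfram",  # placeholder, not from the card
    learning_rate=5e-4,                   # 0.0005
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,                       # optimizer defaults, shown to match the card
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="cosine",
    num_train_epochs=100,
    evaluation_strategy="epoch",          # assumption: validation loss is logged per epoch
)
```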
### Training results

| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:-----:|:---------------:|
| No log | 1.0 | 113 | 6.3789 |
| No log | 2.0 | 226 | 6.0746 |
| No log | 3.0 | 339 | 5.7649 |
| No log | 4.0 | 452 | 5.4453 |
| 5.9875 | 5.0 | 565 | 5.2142 |
| 5.9875 | 6.0 | 678 | 5.0967 |
| 5.9875 | 7.0 | 791 | 5.0143 |
| 5.9875 | 8.0 | 904 | 4.9429 |
| 4.5754 | 9.0 | 1017 | 4.8936 |
| 4.5754 | 10.0 | 1130 | 4.8722 |
| 4.5754 | 11.0 | 1243 | 4.8700 |
| 4.5754 | 12.0 | 1356 | 4.8362 |
| 4.5754 | 13.0 | 1469 | 4.8246 |
| 4.0366 | 14.0 | 1582 | 4.8242 |
| 4.0366 | 15.0 | 1695 | 4.8149 |
| 4.0366 | 16.0 | 1808 | 4.8062 |
| 4.0366 | 17.0 | 1921 | 4.8065 |
| 3.8118 | 18.0 | 2034 | 4.8288 |
| 3.8118 | 19.0 | 2147 | 4.8035 |
| 3.8118 | 20.0 | 2260 | 4.8009 |
| 3.8118 | 21.0 | 2373 | 4.7835 |
| 3.8118 | 22.0 | 2486 | 4.7865 |
| 3.6394 | 23.0 | 2599 | 4.7833 |
| 3.6394 | 24.0 | 2712 | 4.7776 |
| 3.6394 | 25.0 | 2825 | 4.8030 |
| 3.6394 | 26.0 | 2938 | 4.7684 |
| 3.5105 | 27.0 | 3051 | 4.7724 |
| 3.5105 | 28.0 | 3164 | 4.7803 |
| 3.5105 | 29.0 | 3277 | 4.7792 |
| 3.5105 | 30.0 | 3390 | 4.8027 |
| 3.38 | 31.0 | 3503 | 4.8000 |
| 3.38 | 32.0 | 3616 | 4.8046 |
| 3.38 | 33.0 | 3729 | 4.7751 |
| 3.38 | 34.0 | 3842 | 4.7774 |
| 3.38 | 35.0 | 3955 | 4.7733 |
| 3.2382 | 36.0 | 4068 | 4.7886 |
| 3.2382 | 37.0 | 4181 | 4.7892 |
| 3.2382 | 38.0 | 4294 | 4.7876 |
| 3.2382 | 39.0 | 4407 | 4.7965 |
| 3.1022 | 40.0 | 4520 | 4.7879 |
| 3.1022 | 41.0 | 4633 | 4.7829 |
| 3.1022 | 42.0 | 4746 | 4.7884 |
| 3.1022 | 43.0 | 4859 | 4.7845 |
| 3.1022 | 44.0 | 4972 | 4.8193 |
| 2.9571 | 45.0 | 5085 | 4.7947 |
| 2.9571 | 46.0 | 5198 | 4.7968 |
| 2.9571 | 47.0 | 5311 | 4.7894 |
| 2.9571 | 48.0 | 5424 | 4.7892 |
| 2.7555 | 49.0 | 5537 | 4.7914 |
| 2.7555 | 50.0 | 5650 | 4.8099 |
| 2.7555 | 51.0 | 5763 | 4.8029 |
| 2.7555 | 52.0 | 5876 | 4.8000 |
| 2.7555 | 53.0 | 5989 | 4.8092 |
| 2.5656 | 54.0 | 6102 | 4.8111 |
| 2.5656 | 55.0 | 6215 | 4.8257 |
| 2.5656 | 56.0 | 6328 | 4.8109 |
| 2.5656 | 57.0 | 6441 | 4.8457 |
| 2.3501 | 58.0 | 6554 | 4.8428 |
| 2.3501 | 59.0 | 6667 | 4.8519 |
| 2.3501 | 60.0 | 6780 | 4.8652 |
| 2.3501 | 61.0 | 6893 | 4.8788 |
| 2.141 | 62.0 | 7006 | 4.8910 |
| 2.141 | 63.0 | 7119 | 4.8928 |
| 2.141 | 64.0 | 7232 | 4.9112 |
| 2.141 | 65.0 | 7345 | 4.9219 |
| 2.141 | 66.0 | 7458 | 4.9403 |
| 1.9122 | 67.0 | 7571 | 4.9585 |
| 1.9122 | 68.0 | 7684 | 4.9726 |
| 1.9122 | 69.0 | 7797 | 4.9904 |
| 1.9122 | 70.0 | 7910 | 5.0118 |
| 1.7176 | 71.0 | 8023 | 5.0129 |
| 1.7176 | 72.0 | 8136 | 5.0303 |
| 1.7176 | 73.0 | 8249 | 5.0529 |
| 1.7176 | 74.0 | 8362 | 5.0610 |
| 1.7176 | 75.0 | 8475 | 5.0821 |
| 1.5292 | 76.0 | 8588 | 5.0931 |
| 1.5292 | 77.0 | 8701 | 5.1154 |
| 1.5292 | 78.0 | 8814 | 5.1319 |
| 1.5292 | 79.0 | 8927 | 5.1394 |
| 1.3843 | 80.0 | 9040 | 5.1529 |
| 1.3843 | 81.0 | 9153 | 5.1711 |
| 1.3843 | 82.0 | 9266 | 5.1802 |
| 1.3843 | 83.0 | 9379 | 5.1952 |
| 1.3843 | 84.0 | 9492 | 5.2088 |
| 1.2643 | 85.0 | 9605 | 5.2170 |
| 1.2643 | 86.0 | 9718 | 5.2160 |
| 1.2643 | 87.0 | 9831 | 5.2267 |
| 1.2643 | 88.0 | 9944 | 5.2346 |
| 1.1928 | 89.0 | 10057 | 5.2418 |
| 1.1928 | 90.0 | 10170 | 5.2463 |
| 1.1928 | 91.0 | 10283 | 5.2505 |
| 1.1928 | 92.0 | 10396 | 5.2522 |
| 1.1556 | 93.0 | 10509 | 5.2538 |
| 1.1556 | 94.0 | 10622 | 5.2557 |
| 1.1556 | 95.0 | 10735 | 5.2566 |
| 1.1556 | 96.0 | 10848 | 5.2585 |
| 1.1556 | 97.0 | 10961 | 5.2594 |
| 1.1268 | 98.0 | 11074 | 5.2596 |
| 1.1268 | 99.0 | 11187 | 5.2595 |
| 1.1268 | 100.0 | 11300 | 5.2595 |
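Note that the logged validation loss bottoms out around epoch 26 (4.7684) and climbs steadily afterwards while the training loss keeps falling, so the final checkpoint is well past its best point. This run trained the full 100 epochs; purely as a sketch of an alternative, early stopping with the `Trainer` API could keep the best checkpoint instead (`output_dir`, the patience value, and the omitted datasets are placeholders):

```python
from transformers import (AutoModelForCausalLM, EarlyStoppingCallback,
                          Trainer, TrainingArguments)

model = AutoModelForCausalLM.from_pretrained("gpt2")

args = TrainingArguments(
    output_dir="gpt2_finetuned_wolfram",  # placeholder
    evaluation_strategy="epoch",
    save_strategy="epoch",                # must match evaluation_strategy
    load_best_model_at_end=True,
    metric_for_best_model="eval_loss",
    greater_is_better=False,              # lower eval loss is better
)

# train_dataset / eval_dataset omitted: the card does not identify the dataset.
trainer = Trainer(
    model=model,
    args=args,
    callbacks=[EarlyStoppingCallback(early_stopping_patience=5)],
)
```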

### Framework versions

- Transformers 4.26.1
- PyTorch 1.13.1+cu116
- Datasets 2.10.0
- Tokenizers 0.13.2
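When trying to reproduce the run, it may help to confirm that a local environment matches these pins; a small sanity check:

```python
import datasets
import tokenizers
import torch
import transformers

# Compare the local environment against the versions reported above.
assert transformers.__version__ == "4.26.1", transformers.__version__
assert torch.__version__.startswith("1.13.1"), torch.__version__
assert datasets.__version__ == "2.10.0", datasets.__version__
assert tokenizers.__version__ == "0.13.2", tokenizers.__version__
```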