pig4431 committed cae61fa (parent: 6c0de0b)

update model card README.md

Files changed (1): README.md, added (+166 lines)
---
license: mit
tags:
- generated_from_trainer
datasets:
- amazon_polarity
metrics:
- accuracy
model-index:
- name: amazonPolarity_roBERTa_5E
  results:
  - task:
      name: Text Classification
      type: text-classification
    dataset:
      name: amazon_polarity
      type: amazon_polarity
      config: amazon_polarity
      split: train
      args: amazon_polarity
    metrics:
    - name: Accuracy
      type: accuracy
      value: 0.96
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# amazonPolarity_roBERTa_5E

This model is a fine-tuned version of [roberta-base](https://huggingface.co/roberta-base) on the amazon_polarity dataset.
It achieves the following results on the evaluation set:
- Loss: 0.2201
- Accuracy: 0.96

## Model description

More information needed

## Intended uses & limitations

More information needed

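For reference, a minimal inference sketch is shown below. The repository id (`pig4431/amazonPolarity_roBERTa_5E`) and the label names are assumptions based on this card rather than values confirmed by the training script; adjust them to match the published checkpoint.

```python
from transformers import pipeline

# Assumed Hub repository id -- replace with the actual id or a local path.
model_id = "pig4431/amazonPolarity_roBERTa_5E"

classifier = pipeline("text-classification", model=model_id)

print(classifier("This product exceeded my expectations."))
# Example output: [{'label': 'LABEL_1', 'score': 0.99}]
# With the default id2label mapping, LABEL_0/LABEL_1 correspond to the
# amazon_polarity labels 0 (negative) and 1 (positive).
```
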
## Training and evaluation data

More information needed

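The card does not document how the training and evaluation examples were drawn. For context, the public amazon_polarity dataset can be inspected with the `datasets` library as sketched below; the field names are those of the public dataset, while the exact subset used for this run is not specified here.

```python
from datasets import load_dataset

# Binary sentiment dataset of Amazon reviews (label 0 = negative, 1 = positive).
dataset = load_dataset("amazon_polarity")

print(dataset)              # DatasetDict with 'train' (3.6M) and 'test' (400k) splits
print(dataset["train"][0])  # {'label': ..., 'title': ..., 'content': ...}
```
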
## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 32
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5

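These values map one-to-one onto `TrainingArguments` in the Transformers `Trainer` API; the Adam betas and epsilon listed above are the library defaults. The sketch below shows that mapping. The tokenization choice, the ~30k-example training subset (inferred from the ~930 optimizer steps per epoch at batch size 32 in the results table), and the evaluation sample are assumptions for illustration, not the exact script behind this card.

```python
import numpy as np
import evaluate
from datasets import load_dataset
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

tokenizer = AutoTokenizer.from_pretrained("roberta-base")
model = AutoModelForSequenceClassification.from_pretrained("roberta-base", num_labels=2)

dataset = load_dataset("amazon_polarity")

def tokenize(batch):
    # Assumption: only the review body ("content") is fed to the model.
    return tokenizer(batch["content"], truncation=True)

# Assumption: a subsample is used; the full 3.6M-review train split would give far
# more than the ~4,650 optimization steps reported in the results table. The eval
# size of 150 is likewise an assumption, suggested by the accuracy granularity.
train_ds = dataset["train"].shuffle(seed=42).select(range(30_000)).map(tokenize, batched=True)
eval_ds = dataset["test"].shuffle(seed=42).select(range(150)).map(tokenize, batched=True)

accuracy = evaluate.load("accuracy")

def compute_metrics(eval_pred):
    logits, labels = eval_pred
    return accuracy.compute(predictions=np.argmax(logits, axis=-1), references=labels)

args = TrainingArguments(
    output_dir="amazonPolarity_roBERTa_5E",
    learning_rate=1e-5,              # learning_rate
    per_device_train_batch_size=32,  # train_batch_size
    per_device_eval_batch_size=8,    # eval_batch_size
    seed=42,                         # seed
    lr_scheduler_type="linear",      # lr_scheduler_type
    num_train_epochs=5,              # num_epochs
    evaluation_strategy="steps",     # the results table reports metrics every 50 steps
    eval_steps=50,
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=train_ds,
    eval_dataset=eval_ds,
    tokenizer=tokenizer,             # enables dynamic padding via the default collator
    compute_metrics=compute_metrics,
)
trainer.train()
```
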
### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.5785 | 0.05 | 50 | 0.2706 | 0.9133 |
| 0.2731 | 0.11 | 100 | 0.2379 | 0.9267 |
| 0.2223 | 0.16 | 150 | 0.1731 | 0.92 |
| 0.1887 | 0.21 | 200 | 0.1672 | 0.9267 |
| 0.1915 | 0.27 | 250 | 0.2946 | 0.9067 |
| 0.1981 | 0.32 | 300 | 0.1744 | 0.9267 |
| 0.1617 | 0.37 | 350 | 0.2349 | 0.92 |
| 0.1919 | 0.43 | 400 | 0.1605 | 0.9333 |
| 0.1713 | 0.48 | 450 | 0.1626 | 0.94 |
| 0.1961 | 0.53 | 500 | 0.1555 | 0.9467 |
| 0.1652 | 0.59 | 550 | 0.1996 | 0.94 |
| 0.1719 | 0.64 | 600 | 0.1848 | 0.9333 |
| 0.159 | 0.69 | 650 | 0.1783 | 0.9467 |
| 0.1533 | 0.75 | 700 | 0.2016 | 0.9467 |
| 0.1749 | 0.8 | 750 | 0.3943 | 0.8733 |
| 0.1675 | 0.85 | 800 | 0.1948 | 0.9133 |
| 0.1601 | 0.91 | 850 | 0.2044 | 0.92 |
| 0.1424 | 0.96 | 900 | 0.1061 | 0.9533 |
| 0.1447 | 1.01 | 950 | 0.2195 | 0.9267 |
| 0.0997 | 1.07 | 1000 | 0.2102 | 0.9333 |
| 0.1454 | 1.12 | 1050 | 0.1648 | 0.9467 |
| 0.1326 | 1.17 | 1100 | 0.2774 | 0.9 |
| 0.1192 | 1.23 | 1150 | 0.1337 | 0.96 |
| 0.1429 | 1.28 | 1200 | 0.1451 | 0.96 |
| 0.1227 | 1.33 | 1250 | 0.1995 | 0.94 |
| 0.1343 | 1.39 | 1300 | 0.2115 | 0.92 |
| 0.1208 | 1.44 | 1350 | 0.1832 | 0.9467 |
| 0.1314 | 1.49 | 1400 | 0.1298 | 0.96 |
| 0.1069 | 1.55 | 1450 | 0.1778 | 0.94 |
| 0.126 | 1.6 | 1500 | 0.1205 | 0.9667 |
| 0.1162 | 1.65 | 1550 | 0.1569 | 0.9533 |
| 0.0961 | 1.71 | 1600 | 0.1865 | 0.9467 |
| 0.13 | 1.76 | 1650 | 0.1458 | 0.96 |
| 0.1206 | 1.81 | 1700 | 0.1648 | 0.96 |
| 0.1096 | 1.87 | 1750 | 0.2221 | 0.9333 |
| 0.1138 | 1.92 | 1800 | 0.1727 | 0.9533 |
| 0.1258 | 1.97 | 1850 | 0.2036 | 0.9467 |
| 0.1032 | 2.03 | 1900 | 0.1710 | 0.9667 |
| 0.082 | 2.08 | 1950 | 0.2380 | 0.9467 |
| 0.101 | 2.13 | 2000 | 0.1868 | 0.9533 |
| 0.0913 | 2.19 | 2050 | 0.2934 | 0.9267 |
| 0.0859 | 2.24 | 2100 | 0.2385 | 0.9333 |
| 0.1019 | 2.29 | 2150 | 0.1697 | 0.9667 |
| 0.1069 | 2.35 | 2200 | 0.1815 | 0.94 |
| 0.0805 | 2.4 | 2250 | 0.2185 | 0.9467 |
| 0.0906 | 2.45 | 2300 | 0.1923 | 0.96 |
| 0.105 | 2.51 | 2350 | 0.1720 | 0.96 |
| 0.0866 | 2.56 | 2400 | 0.1710 | 0.96 |
| 0.0821 | 2.61 | 2450 | 0.2267 | 0.9533 |
| 0.107 | 2.67 | 2500 | 0.2203 | 0.9467 |
| 0.0841 | 2.72 | 2550 | 0.1621 | 0.9533 |
| 0.0811 | 2.77 | 2600 | 0.1954 | 0.9533 |
| 0.1077 | 2.83 | 2650 | 0.2107 | 0.9533 |
| 0.0771 | 2.88 | 2700 | 0.2398 | 0.9467 |
| 0.08 | 2.93 | 2750 | 0.1816 | 0.96 |
| 0.0827 | 2.99 | 2800 | 0.2311 | 0.9467 |
| 0.1118 | 3.04 | 2850 | 0.1825 | 0.96 |
| 0.0626 | 3.09 | 2900 | 0.2876 | 0.9333 |
| 0.0733 | 3.14 | 2950 | 0.2045 | 0.9467 |
| 0.0554 | 3.2 | 3000 | 0.1775 | 0.96 |
| 0.0569 | 3.25 | 3050 | 0.2208 | 0.9467 |
| 0.0566 | 3.3 | 3100 | 0.2113 | 0.9533 |
| 0.063 | 3.36 | 3150 | 0.2013 | 0.96 |
| 0.056 | 3.41 | 3200 | 0.2229 | 0.96 |
| 0.0791 | 3.46 | 3250 | 0.2472 | 0.9467 |
| 0.0867 | 3.52 | 3300 | 0.1630 | 0.9667 |
| 0.0749 | 3.57 | 3350 | 0.2066 | 0.9533 |
| 0.0653 | 3.62 | 3400 | 0.2085 | 0.96 |
| 0.0784 | 3.68 | 3450 | 0.2068 | 0.9467 |
| 0.074 | 3.73 | 3500 | 0.1976 | 0.96 |
| 0.076 | 3.78 | 3550 | 0.1953 | 0.9533 |
| 0.0807 | 3.84 | 3600 | 0.2246 | 0.9467 |
| 0.077 | 3.89 | 3650 | 0.1867 | 0.9533 |
| 0.0771 | 3.94 | 3700 | 0.2035 | 0.9533 |
| 0.0658 | 4.0 | 3750 | 0.1754 | 0.9667 |
| 0.0711 | 4.05 | 3800 | 0.1977 | 0.9667 |
| 0.066 | 4.1 | 3850 | 0.1806 | 0.9667 |
| 0.0627 | 4.16 | 3900 | 0.1819 | 0.96 |
| 0.0671 | 4.21 | 3950 | 0.2247 | 0.9533 |
| 0.0245 | 4.26 | 4000 | 0.2482 | 0.9467 |
| 0.0372 | 4.32 | 4050 | 0.2201 | 0.96 |
| 0.0607 | 4.37 | 4100 | 0.2381 | 0.9467 |
| 0.0689 | 4.42 | 4150 | 0.2159 | 0.96 |
| 0.0383 | 4.48 | 4200 | 0.2278 | 0.9533 |
| 0.0382 | 4.53 | 4250 | 0.2277 | 0.96 |
| 0.0626 | 4.58 | 4300 | 0.2325 | 0.96 |
| 0.0595 | 4.64 | 4350 | 0.2315 | 0.96 |
| 0.0578 | 4.69 | 4400 | 0.2284 | 0.96 |
| 0.0324 | 4.74 | 4450 | 0.2297 | 0.96 |
| 0.0476 | 4.8 | 4500 | 0.2154 | 0.96 |
| 0.0309 | 4.85 | 4550 | 0.2258 | 0.96 |
| 0.0748 | 4.9 | 4600 | 0.2131 | 0.96 |
| 0.0731 | 4.96 | 4650 | 0.2201 | 0.96 |


### Framework versions

- Transformers 4.24.0
- Pytorch 1.13.0
- Datasets 2.6.1
- Tokenizers 0.13.1