Funnyworld1412 committed on
Commit
e53f1bf
1 Parent(s): 741f5c7

Add SetFit ABSA model

1_Pooling/config.json ADDED
@@ -0,0 +1,10 @@
{
  "word_embedding_dimension": 768,
  "pooling_mode_cls_token": false,
  "pooling_mode_mean_tokens": true,
  "pooling_mode_max_tokens": false,
  "pooling_mode_mean_sqrt_len_tokens": false,
  "pooling_mode_weightedmean_tokens": false,
  "pooling_mode_lasttoken": false,
  "include_prompt": true
}
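The config above enables mean pooling only (`pooling_mode_mean_tokens: true`). As a minimal illustration — not the sentence-transformers implementation, just the idea — masked mean pooling averages token vectors while skipping padding positions:

```python
# Sketch of masked mean pooling with a toy 3-token, 4-dim example;
# the real model pools 768-dim token embeddings.

def mean_pool(token_embeddings, attention_mask):
    """Average token vectors, counting only unmasked positions."""
    dim = len(token_embeddings[0])
    summed = [0.0] * dim
    count = 0
    for vec, mask in zip(token_embeddings, attention_mask):
        if mask:
            count += 1
            for i, v in enumerate(vec):
                summed[i] += v
    return [s / count for s in summed]

embeddings = [[1.0, 2.0, 3.0, 4.0],
              [3.0, 4.0, 5.0, 6.0],
              [9.0, 9.0, 9.0, 9.0]]  # padding token, ignored
mask = [1, 1, 0]
print(mean_pool(embeddings, mask))  # [2.0, 3.0, 4.0, 5.0]
```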
README.md ADDED
@@ -0,0 +1,480 @@
---
library_name: setfit
tags:
- setfit
- absa
- sentence-transformers
- text-classification
- generated_from_setfit_trainer
metrics:
- accuracy
widget:
- text: penambahan jumlah max resin:update qol loadout artefak, skip story, ringkasan
    story jika di skip, dan penambahan jumlah max resin mana min game udah 3 tahun
    gini gini aja gak ada perkembangan. apalagi hadiah untuk pemain selama 3 tahun
    tidak ada peningkatan
- text: dialognya:adain fitur skip dialog gak penting , capek tangan mencetin layar
    doang , mana panjang , dialognya juga ga nyambung sama cerita aslinya ini
- text: anak anak:istilah game kikir itu emang benar sih buat game ini, parah ngabisin
    waktu disuruh nguli trosss hadiah gak seberapa, event gede kecil sama aja reward
    dikit, bukannya gak bersyukur...tapi lu nya aja yg pelit. tidak ramah untuk player
    anak anak yang uang jajannya dikit, dikira anak anak pada kerja semua orang dewasa
    yang kerja aja gaji gak sampe buat topup segitu, minimal beri reward yang lumayan
    lah jangan kecil kecil mulu, dikira gacha itu murah... sekian terima kasih kikir
    impact
- text: perubahan:jujur game nya bagus. grafik mantap. story lumayan. tapi developernya
    kikir ama buta tuli terhadap komunitasnya. tidak ada perubahan dalam segi quality
    of life dalam 3 tahun. ada beberapa qol yang di implementasi tapi kesanya tidak
    berguna. ada masalah dengan game dan kita kritik dev jadi tuli bisu bahkan buta.
    reward anniversary dan lantern rite juga sama selama 3 tahun. gak ada perubahan.
    percuma ngasih survey kepuasan tiap akhir patch kalau cman buat formalitas.
- text: tulisan jaringan:tidak bisa login padahal jaringan bagus paket data juga masih
    banyak, dan dilayar ada tulisan jaringan error, selama saya masih gabisa login
    dan main saya bakal tetap kasih bintang 1
pipeline_tag: text-classification
inference: false
---

# SetFit Aspect Model

This is a [SetFit](https://github.com/huggingface/setfit) model that can be used for Aspect Based Sentiment Analysis (ABSA). A [LogisticRegression](https://scikit-learn.org/stable/modules/generated/sklearn.linear_model.LogisticRegression.html) instance is used for classification. In particular, this model is in charge of filtering aspect span candidates.

The model has been trained using an efficient few-shot learning technique that involves:

1. Fine-tuning a [Sentence Transformer](https://www.sbert.net) with contrastive learning.
2. Training a classification head with features from the fine-tuned Sentence Transformer.

This model was trained as part of a larger ABSA system, which works as follows:

1. Use a spaCy model to select possible aspect span candidates.
2. **Use this SetFit model to filter these possible aspect span candidates.**
3. Use a SetFit model to classify the filtered aspect span candidates.

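As a rough illustration of step 2, the filtering stage keeps candidate spans labeled "aspect" and drops "no aspect" ones. The stub classifier and the aspect list below are hypothetical stand-ins for this model (which actually embeds span candidates and applies a LogisticRegression head):

```python
# Sketch of the aspect-filtering step. stub_classifier is a
# hypothetical stand-in for this SetFit model; in the real system
# the candidate spans come from a spaCy model (id_core_news_trf).

def stub_classifier(span: str) -> str:
    # Hypothetical: pretend these spans were learned as aspects.
    known_aspects = {"story", "reward", "event"}
    return "aspect" if span in known_aspects else "no aspect"

def filter_candidates(candidates):
    """Keep only spans the binary classifier labels as aspects."""
    return [span for span in candidates if stub_classifier(span) == "aspect"]

print(filter_candidates(["story", "saranku developer", "reward", "player"]))
# ['story', 'reward']
```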
## Model Details

### Model Description
- **Model Type:** SetFit
<!-- - **Sentence Transformer:** [Unknown](https://huggingface.co/unknown) -->
- **Classification head:** a [LogisticRegression](https://scikit-learn.org/stable/modules/generated/sklearn.linear_model.LogisticRegression.html) instance
- **spaCy Model:** id_core_news_trf
- **SetFitABSA Aspect Model:** [Funnyworld1412/ABSA_review_game_genshin-aspect](https://huggingface.co/Funnyworld1412/ABSA_review_game_genshin-aspect)
- **SetFitABSA Polarity Model:** [Funnyworld1412/ABSA_review_game_genshin-polarity](https://huggingface.co/Funnyworld1412/ABSA_review_game_genshin-polarity)
- **Maximum Sequence Length:** 512 tokens
- **Number of Classes:** 2 classes
<!-- - **Training Dataset:** [Unknown](https://huggingface.co/datasets/unknown) -->
<!-- - **Language:** Unknown -->
<!-- - **License:** Unknown -->

### Model Sources

- **Repository:** [SetFit on GitHub](https://github.com/huggingface/setfit)
- **Paper:** [Efficient Few-Shot Learning Without Prompts](https://arxiv.org/abs/2209.11055)
- **Blogpost:** [SetFit: Efficient Few-Shot Learning Without Prompts](https://huggingface.co/blog/setfit)

### Model Labels
| Label | Examples |
|:----------|:---------|
| aspect | <ul><li>'story:saranku developer harus menciptakan sebuah story yang sangat menarik, agar tidak kehilangan para player karena masalahnya banyak player yg tidak bertahan lama karena repetitif dan monoton tiap update, size makin gede doang yg isinya cuma chest baru itupun sampah, puzzle yg makin lama makin rumit tapi chest nya sampah, story kebanyakan npc teyvat story utama punya mc dilupain gak difokusin , map kalo udah kosong ya nyampah bikin size gede doang. main 3 tahun rasanya monoton, perkembangan buruk'</li><li>'reward:tolong ditambah lagi reward untuk gachanya, untuk player lama kesulitan mendapatkan primo karena sudah tidak ada lagi quest dan eksplorasi juga sudah 100 . dasar developer kapitalis, game ini makin lama makin monoton dan tidak ramah untuk player lama yang kekurangan bahan untuk gacha karakter'</li><li>'event:cuman saran jangan terlalu pelit.. biar para player gak kabur sama game sebelah hadiah event quest di perbaiki.... udah nunggu event lama lama hadiah cuman gitu gitu aja... sampek event selesai primogemnya buat 10 pull gacha gak cukup.... tingakat kesulitan beda hadiah sama saja... lama lama yang main pada kabur kalok terlalu pelit.. dan 1 lagi jariang mohon di perbaiki untuk server indonya trimaksih'</li></ul> |
| no aspect | <ul><li>'saranku developer:saranku developer harus menciptakan sebuah story yang sangat menarik, agar tidak kehilangan para player karena masalahnya banyak player yg tidak bertahan lama karena repetitif dan monoton tiap update, size makin gede doang yg isinya cuma chest baru itupun sampah, puzzle yg makin lama makin rumit tapi chest nya sampah, story kebanyakan npc teyvat story utama punya mc dilupain gak difokusin , map kalo udah kosong ya nyampah bikin size gede doang. main 3 tahun rasanya monoton, perkembangan buruk'</li><li>'story:saranku developer harus menciptakan sebuah story yang sangat menarik, agar tidak kehilangan para player karena masalahnya banyak player yg tidak bertahan lama karena repetitif dan monoton tiap update, size makin gede doang yg isinya cuma chest baru itupun sampah, puzzle yg makin lama makin rumit tapi chest nya sampah, story kebanyakan npc teyvat story utama punya mc dilupain gak difokusin , map kalo udah kosong ya nyampah bikin size gede doang. main 3 tahun rasanya monoton, perkembangan buruk'</li><li>'player:saranku developer harus menciptakan sebuah story yang sangat menarik, agar tidak kehilangan para player karena masalahnya banyak player yg tidak bertahan lama karena repetitif dan monoton tiap update, size makin gede doang yg isinya cuma chest baru itupun sampah, puzzle yg makin lama makin rumit tapi chest nya sampah, story kebanyakan npc teyvat story utama punya mc dilupain gak difokusin , map kalo udah kosong ya nyampah bikin size gede doang. main 3 tahun rasanya monoton, perkembangan buruk'</li></ul> |

## Uses

### Direct Use for Inference

First install the SetFit library:

```bash
pip install setfit
```

Then you can load this model and run inference.

```python
from setfit import AbsaModel

# Download from the 🤗 Hub
model = AbsaModel.from_pretrained(
    "Funnyworld1412/ABSA_review_game_genshin-aspect",
    "Funnyworld1412/ABSA_review_game_genshin-polarity",
)
# Run inference
preds = model("The food was great, but the venue is just way too busy.")
```

<!--
### Downstream Use

*List how someone could finetune this model on their own dataset.*
-->

<!--
### Out-of-Scope Use

*List how the model may foreseeably be misused and address what users ought not to do with the model.*
-->

<!--
## Bias, Risks and Limitations

*What are the known or foreseeable issues stemming from this model? You could also flag here known failure cases or weaknesses of the model.*
-->

<!--
### Recommendations

*What are recommendations with respect to the foreseeable issues? For example, filtering explicit content.*
-->

## Training Details

### Training Set Metrics
| Training set | Min | Median  | Max |
|:-------------|:----|:--------|:----|
| Word count   | 4   | 49.9079 | 94  |

| Label     | Training Sample Count |
|:----------|:----------------------|
| no aspect | 2281                  |
| aspect    | 477                   |

### Training Hyperparameters
- batch_size: (4, 4)
- num_epochs: (1, 1)
- max_steps: -1
- sampling_strategy: oversampling
- num_iterations: 10
- body_learning_rate: (2e-05, 1e-05)
- head_learning_rate: 0.01
- loss: CosineSimilarityLoss
- distance_metric: cosine_distance
- margin: 0.25
- end_to_end: False
- use_amp: False
- warmup_proportion: 0.1
- seed: 42
- eval_max_steps: -1
- load_best_model_at_end: False

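The Sentence Transformer body is tuned with CosineSimilarityLoss. As a hedged, minimal sketch (not the sentence-transformers implementation), the loss for one labeled pair is the squared error between the embeddings' cosine similarity and the pair label:

```python
import math

def cosine_similarity(u, v):
    """Cosine of the angle between two vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

def cosine_similarity_loss(u, v, label):
    """Squared error between cos(u, v) and the target label
    (1.0 for pairs from the same class, 0.0 otherwise)."""
    return (cosine_similarity(u, v) - label) ** 2

# Identical vectors with a same-class label incur zero loss;
# orthogonal vectors with a different-class label also incur zero loss.
print(cosine_similarity_loss([1.0, 0.0], [1.0, 0.0], 1.0))  # 0.0
print(cosine_similarity_loss([1.0, 0.0], [0.0, 1.0], 0.0))  # 0.0
```

Contrastive pairs are drawn with the oversampling strategy listed above, so the minority "aspect" class contributes as many pairs as the majority class.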
### Training Results
| Epoch  | Step  | Training Loss | Validation Loss |
|:------:|:-----:|:-------------:|:---------------:|
| 0.0001 | 1 | 0.25 | - |
| 0.0036 | 50 | 0.331 | - |
| 0.0073 | 100 | 0.5002 | - |
| 0.0109 | 150 | 0.2904 | - |
| 0.0145 | 200 | 0.3791 | - |
| 0.0181 | 250 | 0.2253 | - |
| 0.0218 | 300 | 0.1909 | - |
| 0.0254 | 350 | 0.2504 | - |
| 0.0290 | 400 | 0.1241 | - |
| 0.0326 | 450 | 0.1021 | - |
| 0.0363 | 500 | 0.0985 | - |
| 0.0399 | 550 | 0.3831 | - |
| 0.0435 | 600 | 0.1841 | - |
| 0.0471 | 650 | 0.2487 | - |
| 0.0508 | 700 | 0.1573 | - |
| 0.0544 | 750 | 0.0499 | - |
| 0.0580 | 800 | 0.2214 | - |
| 0.0616 | 850 | 0.1427 | - |
| 0.0653 | 900 | 0.3544 | - |
| 0.0689 | 950 | 0.042 | - |
| 0.0725 | 1000 | 0.2918 | - |
| 0.0761 | 1050 | 0.0134 | - |
| 0.0798 | 1100 | 0.1933 | - |
| 0.0834 | 1150 | 0.0115 | - |
| 0.0870 | 1200 | 0.2393 | - |
| 0.0906 | 1250 | 0.2625 | - |
| 0.0943 | 1300 | 0.1496 | - |
| 0.0979 | 1350 | 0.1417 | - |
| 0.1015 | 1400 | 0.2111 | - |
| 0.1051 | 1450 | 0.2158 | - |
| 0.1088 | 1500 | 0.1378 | - |
| 0.1124 | 1550 | 0.0988 | - |
| 0.1160 | 1600 | 0.1183 | - |
| 0.1197 | 1650 | 0.324 | - |
| 0.1233 | 1700 | 0.3722 | - |
| 0.1269 | 1750 | 0.1696 | - |
| 0.1305 | 1800 | 0.2893 | - |
| 0.1342 | 1850 | 0.198 | - |
| 0.1378 | 1900 | 0.2854 | - |
| 0.1414 | 1950 | 0.3339 | - |
| 0.1450 | 2000 | 0.0783 | - |
| 0.1487 | 2050 | 0.014 | - |
| 0.1523 | 2100 | 0.0205 | - |
| 0.1559 | 2150 | 0.0151 | - |
| 0.1595 | 2200 | 0.3783 | - |
| 0.1632 | 2250 | 0.381 | - |
| 0.1668 | 2300 | 0.144 | - |
| 0.1704 | 2350 | 0.0023 | - |
| 0.1740 | 2400 | 0.1903 | - |
| 0.1777 | 2450 | 0.0033 | - |
| 0.1813 | 2500 | 0.0039 | - |
| 0.1849 | 2550 | 0.0019 | - |
| 0.1885 | 2600 | 0.0565 | - |
| 0.1922 | 2650 | 0.1551 | - |
| 0.1958 | 2700 | 0.0729 | - |
| 0.1994 | 2750 | 0.0272 | - |
| 0.2030 | 2800 | 0.495 | - |
| 0.2067 | 2850 | 0.0396 | - |
| 0.2103 | 2900 | 0.2288 | - |
| 0.2139 | 2950 | 0.0077 | - |
| 0.2175 | 3000 | 0.0642 | - |
| 0.2212 | 3050 | 0.0037 | - |
| 0.2248 | 3100 | 0.2447 | - |
| 0.2284 | 3150 | 0.0097 | - |
| 0.2321 | 3200 | 0.0011 | - |
| 0.2357 | 3250 | 0.1254 | - |
| 0.2393 | 3300 | 0.0046 | - |
| 0.2429 | 3350 | 0.0127 | - |
| 0.2466 | 3400 | 0.0093 | - |
| 0.2502 | 3450 | 0.0005 | - |
| 0.2538 | 3500 | 0.0022 | - |
| 0.2574 | 3550 | 0.0005 | - |
| 0.2611 | 3600 | 0.0002 | - |
| 0.2647 | 3650 | 0.0231 | - |
| 0.2683 | 3700 | 0.0016 | - |
| 0.2719 | 3750 | 0.1945 | - |
| 0.2756 | 3800 | 0.002 | - |
| 0.2792 | 3850 | 0.0235 | - |
| 0.2828 | 3900 | 0.006 | - |
| 0.2864 | 3950 | 0.0003 | - |
| 0.2901 | 4000 | 0.007 | - |
| 0.2937 | 4050 | 0.0227 | - |
| 0.2973 | 4100 | 0.1794 | - |
| 0.3009 | 4150 | 0.2629 | - |
| 0.3046 | 4200 | 0.3005 | - |
| 0.3082 | 4250 | 0.1974 | - |
| 0.3118 | 4300 | 0.001 | - |
| 0.3154 | 4350 | 0.0123 | - |
| 0.3191 | 4400 | 0.0027 | - |
| 0.3227 | 4450 | 0.0002 | - |
| 0.3263 | 4500 | 0.0005 | - |
| 0.3299 | 4550 | 0.0002 | - |
| 0.3336 | 4600 | 0.0007 | - |
| 0.3372 | 4650 | 0.0332 | - |
| 0.3408 | 4700 | 0.052 | - |
| 0.3445 | 4750 | 0.0103 | - |
| 0.3481 | 4800 | 0.0067 | - |
| 0.3517 | 4850 | 0.0003 | - |
| 0.3553 | 4900 | 0.0008 | - |
| 0.3590 | 4950 | 0.0088 | - |
| 0.3626 | 5000 | 0.0002 | - |
| 0.3662 | 5050 | 0.0111 | - |
| 0.3698 | 5100 | 0.0836 | - |
| 0.3735 | 5150 | 0.0001 | - |
| 0.3771 | 5200 | 0.2398 | - |
| 0.3807 | 5250 | 0.0002 | - |
| 0.3843 | 5300 | 0.1435 | - |
| 0.3880 | 5350 | 0.0001 | - |
| 0.3916 | 5400 | 0.0296 | - |
| 0.3952 | 5450 | 0.0003 | - |
| 0.3988 | 5500 | 0.1126 | - |
| 0.4025 | 5550 | 0.0009 | - |
| 0.4061 | 5600 | 0.0055 | - |
| 0.4097 | 5650 | 0.0031 | - |
| 0.4133 | 5700 | 0.1929 | - |
| 0.4170 | 5750 | 0.0002 | - |
| 0.4206 | 5800 | 0.2565 | - |
| 0.4242 | 5850 | 0.0002 | - |
| 0.4278 | 5900 | 0.0033 | - |
| 0.4315 | 5950 | 0.0011 | - |
| 0.4351 | 6000 | 0.0001 | - |
| 0.4387 | 6050 | 0.0004 | - |
| 0.4423 | 6100 | 0.0003 | - |
| 0.4460 | 6150 | 0.1076 | - |
| 0.4496 | 6200 | 0.0011 | - |
| 0.4532 | 6250 | 0.0034 | - |
| 0.4569 | 6300 | 0.0176 | - |
| 0.4605 | 6350 | 0.2883 | - |
| 0.4641 | 6400 | 0.0 | - |
| 0.4677 | 6450 | 0.0172 | - |
| 0.4714 | 6500 | 0.0014 | - |
| 0.4750 | 6550 | 0.0571 | - |
| 0.4786 | 6600 | 0.0287 | - |
| 0.4822 | 6650 | 0.1461 | - |
| 0.4859 | 6700 | 0.2333 | - |
| 0.4895 | 6750 | 0.1468 | - |
| 0.4931 | 6800 | 0.0005 | - |
| 0.4967 | 6850 | 0.0039 | - |
| 0.5004 | 6900 | 0.0004 | - |
| 0.5040 | 6950 | 0.0008 | - |
| 0.5076 | 7000 | 0.0004 | - |
| 0.5112 | 7050 | 0.0005 | - |
| 0.5149 | 7100 | 0.001 | - |
| 0.5185 | 7150 | 0.0041 | - |
| 0.5221 | 7200 | 0.0157 | - |
| 0.5257 | 7250 | 0.0228 | - |
| 0.5294 | 7300 | 0.0002 | - |
| 0.5330 | 7350 | 0.0004 | - |
| 0.5366 | 7400 | 0.0081 | - |
| 0.5402 | 7450 | 0.0004 | - |
| 0.5439 | 7500 | 0.1227 | - |
| 0.5475 | 7550 | 0.0001 | - |
| 0.5511 | 7600 | 0.0006 | - |
| 0.5547 | 7650 | 0.0003 | - |
| 0.5584 | 7700 | 0.0475 | - |
| 0.5620 | 7750 | 0.1848 | - |
| 0.5656 | 7800 | 0.0007 | - |
| 0.5693 | 7850 | 0.001 | - |
| 0.5729 | 7900 | 0.0002 | - |
| 0.5765 | 7950 | 0.0018 | - |
| 0.5801 | 8000 | 0.0009 | - |
| 0.5838 | 8050 | 0.0019 | - |
| 0.5874 | 8100 | 0.0001 | - |
| 0.5910 | 8150 | 0.0012 | - |
| 0.5946 | 8200 | 0.0536 | - |
| 0.5983 | 8250 | 0.0943 | - |
| 0.6019 | 8300 | 0.006 | - |
| 0.6055 | 8350 | 0.0019 | - |
| 0.6091 | 8400 | 0.0 | - |
| 0.6128 | 8450 | 0.0004 | - |
| 0.6164 | 8500 | 0.0 | - |
| 0.6200 | 8550 | 0.2588 | - |
| 0.6236 | 8600 | 0.0001 | - |
| 0.6273 | 8650 | 0.0084 | - |
| 0.6309 | 8700 | 0.0001 | - |
| 0.6345 | 8750 | 0.4123 | - |
| 0.6381 | 8800 | 0.073 | - |
| 0.6418 | 8850 | 0.0 | - |
| 0.6454 | 8900 | 0.1361 | - |
| 0.6490 | 8950 | 0.0249 | - |
| 0.6526 | 9000 | 0.0003 | - |
| 0.6563 | 9050 | 0.0018 | - |
| 0.6599 | 9100 | 0.0115 | - |
| 0.6635 | 9150 | 0.1789 | - |
| 0.6672 | 9200 | 0.0001 | - |
| 0.6708 | 9250 | 0.0006 | - |
| 0.6744 | 9300 | 0.002 | - |
| 0.6780 | 9350 | 0.0 | - |
| 0.6817 | 9400 | 0.0042 | - |
| 0.6853 | 9450 | 0.0003 | - |
| 0.6889 | 9500 | 0.0105 | - |
| 0.6925 | 9550 | 0.0 | - |
| 0.6962 | 9600 | 0.0285 | - |
| 0.6998 | 9650 | 0.0002 | - |
| 0.7034 | 9700 | 0.0 | - |
| 0.7070 | 9750 | 0.001 | - |
| 0.7107 | 9800 | 0.0641 | - |
| 0.7143 | 9850 | 0.0096 | - |
| 0.7179 | 9900 | 0.0001 | - |
| 0.7215 | 9950 | 0.0003 | - |
| 0.7252 | 10000 | 0.3666 | - |
| 0.7288 | 10050 | 0.0001 | - |
| 0.7324 | 10100 | 0.0001 | - |
| 0.7360 | 10150 | 0.0001 | - |
| 0.7397 | 10200 | 0.2526 | - |
| 0.7433 | 10250 | 0.0286 | - |
| 0.7469 | 10300 | 0.0001 | - |
| 0.7505 | 10350 | 0.004 | - |
| 0.7542 | 10400 | 0.0 | - |
| 0.7578 | 10450 | 0.0237 | - |
| 0.7614 | 10500 | 0.0012 | - |
| 0.7650 | 10550 | 0.0001 | - |
| 0.7687 | 10600 | 0.0223 | - |
| 0.7723 | 10650 | 0.0349 | - |
| 0.7759 | 10700 | 0.033 | - |
| 0.7796 | 10750 | 0.0005 | - |
| 0.7832 | 10800 | 0.0001 | - |
| 0.7868 | 10850 | 0.0001 | - |
| 0.7904 | 10900 | 0.0002 | - |
| 0.7941 | 10950 | 0.0005 | - |
| 0.7977 | 11000 | 0.0003 | - |
| 0.8013 | 11050 | 0.0 | - |
| 0.8049 | 11100 | 0.0348 | - |
| 0.8086 | 11150 | 0.0 | - |
| 0.8122 | 11200 | 0.0001 | - |
| 0.8158 | 11250 | 0.0 | - |
| 0.8194 | 11300 | 0.0 | - |
| 0.8231 | 11350 | 0.0 | - |
| 0.8267 | 11400 | 0.0002 | - |
| 0.8303 | 11450 | 0.0002 | - |
| 0.8339 | 11500 | 0.0112 | - |
| 0.8376 | 11550 | 0.0099 | - |
| 0.8412 | 11600 | 0.0 | - |
| 0.8448 | 11650 | 0.0 | - |
| 0.8484 | 11700 | 0.045 | - |
| 0.8521 | 11750 | 0.138 | - |
| 0.8557 | 11800 | 0.0283 | - |
| 0.8593 | 11850 | 0.0001 | - |
| 0.8629 | 11900 | 0.0 | - |
| 0.8666 | 11950 | 0.0751 | - |
| 0.8702 | 12000 | 0.0002 | - |
| 0.8738 | 12050 | 0.0 | - |
| 0.8774 | 12100 | 0.0001 | - |
| 0.8811 | 12150 | 0.0948 | - |
| 0.8847 | 12200 | 0.0896 | - |
| 0.8883 | 12250 | 0.1255 | - |
| 0.8920 | 12300 | 0.0001 | - |
| 0.8956 | 12350 | 0.0 | - |
| 0.8992 | 12400 | 0.1456 | - |
| 0.9028 | 12450 | 0.0079 | - |
| 0.9065 | 12500 | 0.0 | - |
| 0.9101 | 12550 | 0.0 | - |
| 0.9137 | 12600 | 0.0002 | - |
| 0.9173 | 12650 | 0.0047 | - |
| 0.9210 | 12700 | 0.1701 | - |
| 0.9246 | 12750 | 0.0423 | - |
| 0.9282 | 12800 | 0.0001 | - |
| 0.9318 | 12850 | 0.0969 | - |
| 0.9355 | 12900 | 0.0001 | - |
| 0.9391 | 12950 | 0.0 | - |
| 0.9427 | 13000 | 0.0 | - |
| 0.9463 | 13050 | 0.0301 | - |
| 0.9500 | 13100 | 0.0066 | - |
| 0.9536 | 13150 | 0.0 | - |
| 0.9572 | 13200 | 0.0 | - |
| 0.9608 | 13250 | 0.0 | - |
| 0.9645 | 13300 | 0.0 | - |
| 0.9681 | 13350 | 0.0008 | - |
| 0.9717 | 13400 | 0.0255 | - |
| 0.9753 | 13450 | 0.0 | - |
| 0.9790 | 13500 | 0.0908 | - |
| 0.9826 | 13550 | 0.0826 | - |
| 0.9862 | 13600 | 0.0 | - |
| 0.9898 | 13650 | 0.0247 | - |
| 0.9935 | 13700 | 0.0 | - |
| 0.9971 | 13750 | 0.0546 | - |

### Framework Versions
- Python: 3.10.13
- SetFit: 1.0.3
- Sentence Transformers: 3.0.1
- spaCy: 3.7.5
- Transformers: 4.36.2
- PyTorch: 2.1.2
- Datasets: 2.19.2
- Tokenizers: 0.15.2

## Citation

### BibTeX
```bibtex
@article{https://doi.org/10.48550/arxiv.2209.11055,
  doi = {10.48550/ARXIV.2209.11055},
  url = {https://arxiv.org/abs/2209.11055},
  author = {Tunstall, Lewis and Reimers, Nils and Jo, Unso Eun Seo and Bates, Luke and Korat, Daniel and Wasserblat, Moshe and Pereg, Oren},
  keywords = {Computation and Language (cs.CL), FOS: Computer and information sciences},
  title = {Efficient Few-Shot Learning Without Prompts},
  publisher = {arXiv},
  year = {2022},
  copyright = {Creative Commons Attribution 4.0 International}
}
```

<!--
## Glossary

*Clearly define terms in order to be accessible across audiences.*
-->

<!--
## Model Card Authors

*Lists the people who create the model card, providing recognition and accountability for the detailed work that goes into its construction.*
-->

<!--
## Model Card Contact

*Provides a way for people who have updates to the Model Card, suggestions, or questions, to contact the Model Card authors.*
-->
config.json ADDED
@@ -0,0 +1,47 @@
{
  "_name_or_path": "firqaaa/indo-setfit-absa-bert-base-restaurants-aspect",
  "_num_labels": 5,
  "architectures": [
    "BertModel"
  ],
  "attention_probs_dropout_prob": 0.1,
  "classifier_dropout": null,
  "directionality": "bidi",
  "hidden_act": "gelu",
  "hidden_dropout_prob": 0.1,
  "hidden_size": 768,
  "id2label": {
    "0": "LABEL_0",
    "1": "LABEL_1",
    "2": "LABEL_2",
    "3": "LABEL_3",
    "4": "LABEL_4"
  },
  "initializer_range": 0.02,
  "intermediate_size": 3072,
  "label2id": {
    "LABEL_0": 0,
    "LABEL_1": 1,
    "LABEL_2": 2,
    "LABEL_3": 3,
    "LABEL_4": 4
  },
  "layer_norm_eps": 1e-12,
  "max_position_embeddings": 512,
  "model_type": "bert",
  "num_attention_heads": 12,
  "num_hidden_layers": 12,
  "output_past": true,
  "pad_token_id": 0,
  "pooler_fc_size": 768,
  "pooler_num_attention_heads": 12,
  "pooler_num_fc_layers": 3,
  "pooler_size_per_head": 128,
  "pooler_type": "first_token_transform",
  "position_embedding_type": "absolute",
  "torch_dtype": "float32",
  "transformers_version": "4.36.2",
  "type_vocab_size": 2,
  "use_cache": true,
  "vocab_size": 50000
}
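As a sanity check, the parameter count implied by this config can be estimated and compared against the float32 checkpoint (model.safetensors is 497,787,752 bytes). This is a back-of-the-envelope sketch assuming the standard BERT layout, not a count read from the checkpoint:

```python
# Rough parameter count for a BERT encoder with the config above:
# hidden 768, 12 layers, intermediate 3072, vocab 50000,
# 512 positions, 2 token types.
H, L, I, V, P, T = 768, 12, 3072, 50000, 512, 2

embeddings = V * H + P * H + T * H + 2 * H  # word/pos/type + LayerNorm
per_layer = (
    4 * (H * H + H)   # Q, K, V and attention output projections
    + 2 * H           # attention LayerNorm
    + (H * I + I)     # intermediate dense
    + (I * H + H)     # output dense
    + 2 * H           # output LayerNorm
)
pooler = H * H + H
total = embeddings + L * per_layer + pooler

print(total)      # 124441344 (~124.4M parameters)
print(total * 4)  # 497765376 float32 bytes, within ~22 KB of the
                  # safetensors file size (the gap is file metadata)
```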
config_sentence_transformers.json ADDED
@@ -0,0 +1,10 @@
{
  "__version__": {
    "sentence_transformers": "3.0.1",
    "transformers": "4.36.2",
    "pytorch": "2.1.2"
  },
  "prompts": {},
  "default_prompt_name": null,
  "similarity_fn_name": null
}
config_setfit.json ADDED
@@ -0,0 +1,9 @@
{
  "span_context": 0,
  "spacy_model": "id_core_news_trf",
  "normalize_embeddings": false,
  "labels": [
    "no aspect",
    "aspect"
  ]
}
model.safetensors ADDED
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:c6bb1837e6080cca5acef88ed0429143ffbff96044d75f5964eb1da542fa4ef5
size 497787752
model_head.pkl ADDED
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:aeb819e71496ccb7f165a97f331c5f3fec69a0c06e1bc10a5839e967ad2177c8
size 6991
modules.json ADDED
@@ -0,0 +1,14 @@
[
  {
    "idx": 0,
    "name": "0",
    "path": "",
    "type": "sentence_transformers.models.Transformer"
  },
  {
    "idx": 1,
    "name": "1",
    "path": "1_Pooling",
    "type": "sentence_transformers.models.Pooling"
  }
]
sentence_bert_config.json ADDED
@@ -0,0 +1,4 @@
{
  "max_seq_length": 512,
  "do_lower_case": false
}
special_tokens_map.json ADDED
@@ -0,0 +1,37 @@
{
  "cls_token": {
    "content": "[CLS]",
    "lstrip": false,
    "normalized": false,
    "rstrip": false,
    "single_word": false
  },
  "mask_token": {
    "content": "[MASK]",
    "lstrip": false,
    "normalized": false,
    "rstrip": false,
    "single_word": false
  },
  "pad_token": {
    "content": "[PAD]",
    "lstrip": false,
    "normalized": false,
    "rstrip": false,
    "single_word": false
  },
  "sep_token": {
    "content": "[SEP]",
    "lstrip": false,
    "normalized": false,
    "rstrip": false,
    "single_word": false
  },
  "unk_token": {
    "content": "[UNK]",
    "lstrip": false,
    "normalized": false,
    "rstrip": false,
    "single_word": false
  }
}
tokenizer.json ADDED
The diff for this file is too large to render. See raw diff
 
tokenizer_config.json ADDED
@@ -0,0 +1,64 @@
{
  "added_tokens_decoder": {
    "0": {
      "content": "[PAD]",
      "lstrip": false,
      "normalized": false,
      "rstrip": false,
      "single_word": false,
      "special": true
    },
    "1": {
      "content": "[UNK]",
      "lstrip": false,
      "normalized": false,
      "rstrip": false,
      "single_word": false,
      "special": true
    },
    "2": {
      "content": "[CLS]",
      "lstrip": false,
      "normalized": false,
      "rstrip": false,
      "single_word": false,
      "special": true
    },
    "3": {
      "content": "[SEP]",
      "lstrip": false,
      "normalized": false,
      "rstrip": false,
      "single_word": false,
      "special": true
    },
    "4": {
      "content": "[MASK]",
      "lstrip": false,
      "normalized": false,
      "rstrip": false,
      "single_word": false,
      "special": true
    }
  },
  "clean_up_tokenization_spaces": true,
  "cls_token": "[CLS]",
  "do_basic_tokenize": true,
  "do_lower_case": true,
  "mask_token": "[MASK]",
  "max_length": 512,
  "model_max_length": 512,
  "never_split": null,
  "pad_to_multiple_of": null,
  "pad_token": "[PAD]",
  "pad_token_type_id": 0,
  "padding_side": "right",
  "sep_token": "[SEP]",
  "stride": 0,
  "strip_accents": null,
  "tokenize_chinese_chars": true,
  "tokenizer_class": "BertTokenizer",
  "truncation_side": "right",
  "truncation_strategy": "longest_first",
  "unk_token": "[UNK]"
}
vocab.txt ADDED
The diff for this file is too large to render. See raw diff