Chernoffface committed
Commit 02cc311
1 Parent(s): c093827

Push model using huggingface_hub.

README.md CHANGED
@@ -1,493 +1,131 @@
1
- ---
2
- base_model: sentence-transformers/paraphrase-multilingual-MiniLM-L12-v2
3
- library_name: setfit
4
- metrics:
5
- - accuracy
6
- pipeline_tag: text-classification
7
- tags:
8
- - setfit
9
- - sentence-transformers
10
- - text-classification
11
- - generated_from_setfit_trainer
12
- widget:
13
- - text: How much should I invest in communication activities?
14
- - text: In addition, we will consider public reactions and reviews of these works.
15
- - text: Grundlagen der Fachdidaktik Pädagogik
16
- - text: >-
17
- Die Einzelthemen umfassen: * Hard- and Software-Architecture of Modern Game
18
- Systems * Time Management in Milliseconds * Asset Loading and Compression *
19
- Physically Based Realtime Rendering and Animations * Handling of Large Game
20
- Scenes * Audio Simulation and Mixing * Constraint-Based Physics Simulation *
21
- Artificial Intelligence for Games * Multiplayer-Networking * Procedural
22
- Content Creation * Integration of Scripting Languages * Optimization and
23
- parallelization of CPU and GPU Code Die Übungen enthalten Theorie- und
24
- Praxisanteile.
25
- - text: >-
26
- Wie entsteht überhaupt eine Ausstellung und in diesem Fall: eine, die
27
- weniger auf den Wert des Originals als die Kreativität ihrer Besucher setzt?
28
- inference: false
29
- language:
30
- - de
31
- - en
32
- ---
33
-
34
- # SetFit with sentence-transformers/paraphrase-multilingual-MiniLM-L12-v2
35
-
36
- This is a [SetFit](https://github.com/huggingface/setfit) model that can be used for Text Classification. This SetFit model uses [sentence-transformers/paraphrase-multilingual-MiniLM-L12-v2](https://huggingface.co/sentence-transformers/paraphrase-multilingual-MiniLM-L12-v2) as the Sentence Transformer embedding model. A MultiOutputClassifier instance is used for classification.
37
-
38
- The model has been trained using an efficient few-shot learning technique that involves:
39
-
40
- 1. Fine-tuning a [Sentence Transformer](https://www.sbert.net) with contrastive learning.
41
- 2. Training a classification head with features from the fine-tuned Sentence Transformer.
42
-
43
- ## Model Details
44
-
45
- ### Model Description
46
- - **Model Type:** SetFit
47
- - **Sentence Transformer body:** [sentence-transformers/paraphrase-multilingual-MiniLM-L12-v2](https://huggingface.co/sentence-transformers/paraphrase-multilingual-MiniLM-L12-v2)
48
- - **Classification head:** a MultiOutputClassifier instance
49
- - **Maximum Sequence Length:** 128 tokens
50
- - **Number of Classes:** 6 classes
51
- <!-- - **Training Dataset:** [Unknown](https://huggingface.co/datasets/unknown) -->
52
- <!-- - **Language:** Unknown -->
53
- <!-- - **License:** Unknown -->
54
-
55
- ### Model Sources
56
-
57
- - **Repository:** [SetFit on GitHub](https://github.com/huggingface/setfit)
58
- - **Paper:** [Efficient Few-Shot Learning Without Prompts](https://arxiv.org/abs/2209.11055)
59
- - **Blogpost:** [SetFit: Efficient Few-Shot Learning Without Prompts](https://huggingface.co/blog/setfit)
60
-
61
- ## Uses
62
-
63
- ### Direct Use for Inference
64
-
65
- First install the SetFit library:
66
-
67
- ```bash
68
- pip install setfit
69
- ```
70
-
71
- Then you can load this model and run inference.
72
-
73
- ```python
74
- from setfit import SetFitModel
75
-
76
- # Download from the 🤗 Hub
77
- model = SetFitModel.from_pretrained("Chernoffface/fs-setfit-multilable-model")
78
- # Run inference
79
- preds = model("Grundlagen der Fachdidaktik Pädagogik")
80
- ```
81
-
82
- <!--
83
- ### Downstream Use
84
-
85
- *List how someone could finetune this model on their own dataset.*
86
- -->
87
-
88
- <!--
89
- ### Out-of-Scope Use
90
-
91
- *List how the model may foreseeably be misused and address what users ought not to do with the model.*
92
- -->
93
-
94
- <!--
95
- ## Bias, Risks and Limitations
96
-
97
- *What are the known or foreseeable issues stemming from this model? You could also flag here known failure cases or weaknesses of the model.*
98
- -->
99
-
100
- <!--
101
- ### Recommendations
102
-
103
- *What are recommendations with respect to the foreseeable issues? For example, filtering explicit content.*
104
- -->
105
-
106
- ## Training Details
107
-
108
- ### Training Set Metrics
109
- | Training set | Min | Median | Max |
110
- |:-------------|:----|:--------|:----|
111
- | Word count | 1 | 12.9119 | 131 |
112
-
113
- ### Training Hyperparameters
114
- - batch_size: (16, 16)
115
- - num_epochs: (2, 2)
116
- - max_steps: -1
117
- - sampling_strategy: oversampling
118
- - num_iterations: 40
119
- - body_learning_rate: (2e-05, 2e-05)
120
- - head_learning_rate: 2e-05
121
- - loss: CosineSimilarityLoss
122
- - distance_metric: cosine_distance
123
- - margin: 0.25
124
- - end_to_end: False
125
- - use_amp: False
126
- - warmup_proportion: 0.1
127
- - l2_weight: 0.01
128
- - seed: 42
129
- - eval_max_steps: -1
130
- - load_best_model_at_end: False
131
-
132
- ### Training Results
133
- | Epoch | Step | Training Loss | Validation Loss |
134
- |:------:|:-----:|:-------------:|:---------------:|
135
- | 0.0001 | 1 | 0.1571 | - |
136
- | 0.0063 | 50 | 0.1986 | - |
137
- | 0.0127 | 100 | 0.1774 | - |
138
- | 0.0190 | 150 | 0.136 | - |
139
- | 0.0254 | 200 | 0.1061 | - |
140
- | 0.0317 | 250 | 0.0779 | - |
141
- | 0.0380 | 300 | 0.0671 | - |
142
- | 0.0444 | 350 | 0.0482 | - |
143
- | 0.0507 | 400 | 0.0444 | - |
144
- | 0.0571 | 450 | 0.0427 | - |
145
- | 0.0634 | 500 | 0.0323 | - |
146
- | 0.0698 | 550 | 0.0274 | - |
147
- | 0.0761 | 600 | 0.0301 | - |
148
- | 0.0824 | 650 | 0.0259 | - |
149
- | 0.0888 | 700 | 0.0274 | - |
150
- | 0.0951 | 750 | 0.0305 | - |
151
- | 0.1015 | 800 | 0.0221 | - |
152
- | 0.1078 | 850 | 0.0185 | - |
153
- | 0.1141 | 900 | 0.0208 | - |
154
- | 0.1205 | 950 | 0.0198 | - |
155
- | 0.1268 | 1000 | 0.0107 | - |
156
- | 0.1332 | 1050 | 0.0149 | - |
157
- | 0.1395 | 1100 | 0.0162 | - |
158
- | 0.1458 | 1150 | 0.0119 | - |
159
- | 0.1522 | 1200 | 0.0162 | - |
160
- | 0.1585 | 1250 | 0.0133 | - |
161
- | 0.1649 | 1300 | 0.0177 | - |
162
- | 0.1712 | 1350 | 0.0102 | - |
163
- | 0.1776 | 1400 | 0.0224 | - |
164
- | 0.1839 | 1450 | 0.0107 | - |
165
- | 0.1902 | 1500 | 0.0182 | - |
166
- | 0.1966 | 1550 | 0.0137 | - |
167
- | 0.2029 | 1600 | 0.0158 | - |
168
- | 0.2093 | 1650 | 0.0142 | - |
169
- | 0.2156 | 1700 | 0.0117 | - |
170
- | 0.2219 | 1750 | 0.0161 | - |
171
- | 0.2283 | 1800 | 0.0128 | - |
172
- | 0.2346 | 1850 | 0.0118 | - |
173
- | 0.2410 | 1900 | 0.0125 | - |
174
- | 0.2473 | 1950 | 0.0135 | - |
175
- | 0.2536 | 2000 | 0.0123 | - |
176
- | 0.2600 | 2050 | 0.0128 | - |
177
- | 0.2663 | 2100 | 0.0119 | - |
178
- | 0.2727 | 2150 | 0.0074 | - |
179
- | 0.2790 | 2200 | 0.0116 | - |
180
- | 0.2854 | 2250 | 0.0088 | - |
181
- | 0.2917 | 2300 | 0.008 | - |
182
- | 0.2980 | 2350 | 0.0137 | - |
183
- | 0.3044 | 2400 | 0.0087 | - |
184
- | 0.3107 | 2450 | 0.0107 | - |
185
- | 0.3171 | 2500 | 0.0118 | - |
186
- | 0.3234 | 2550 | 0.0096 | - |
187
- | 0.3297 | 2600 | 0.0073 | - |
188
- | 0.3361 | 2650 | 0.0125 | - |
189
- | 0.3424 | 2700 | 0.0085 | - |
190
- | 0.3488 | 2750 | 0.0081 | - |
191
- | 0.3551 | 2800 | 0.0097 | - |
192
- | 0.3614 | 2850 | 0.0104 | - |
193
- | 0.3678 | 2900 | 0.0062 | - |
194
- | 0.3741 | 2950 | 0.0124 | - |
195
- | 0.3805 | 3000 | 0.0115 | - |
196
- | 0.3868 | 3050 | 0.012 | - |
197
- | 0.3932 | 3100 | 0.0147 | - |
198
- | 0.3995 | 3150 | 0.0097 | - |
199
- | 0.4058 | 3200 | 0.0107 | - |
200
- | 0.4122 | 3250 | 0.0074 | - |
201
- | 0.4185 | 3300 | 0.013 | - |
202
- | 0.4249 | 3350 | 0.0115 | - |
203
- | 0.4312 | 3400 | 0.008 | - |
204
- | 0.4375 | 3450 | 0.0087 | - |
205
- | 0.4439 | 3500 | 0.0099 | - |
206
- | 0.4502 | 3550 | 0.0076 | - |
207
- | 0.4566 | 3600 | 0.0118 | - |
208
- | 0.4629 | 3650 | 0.013 | - |
209
- | 0.4692 | 3700 | 0.0107 | - |
210
- | 0.4756 | 3750 | 0.0123 | - |
211
- | 0.4819 | 3800 | 0.0101 | - |
212
- | 0.4883 | 3850 | 0.0095 | - |
213
- | 0.4946 | 3900 | 0.01 | - |
214
- | 0.5010 | 3950 | 0.0068 | - |
215
- | 0.5073 | 4000 | 0.0064 | - |
216
- | 0.5136 | 4050 | 0.0096 | - |
217
- | 0.5200 | 4100 | 0.0063 | - |
218
- | 0.5263 | 4150 | 0.0083 | - |
219
- | 0.5327 | 4200 | 0.0067 | - |
220
- | 0.5390 | 4250 | 0.0095 | - |
221
- | 0.5453 | 4300 | 0.0097 | - |
222
- | 0.5517 | 4350 | 0.0057 | - |
223
- | 0.5580 | 4400 | 0.0101 | - |
224
- | 0.5644 | 4450 | 0.0101 | - |
225
- | 0.5707 | 4500 | 0.0043 | - |
226
- | 0.5770 | 4550 | 0.0099 | - |
227
- | 0.5834 | 4600 | 0.0091 | - |
228
- | 0.5897 | 4650 | 0.0065 | - |
229
- | 0.5961 | 4700 | 0.0071 | - |
230
- | 0.6024 | 4750 | 0.0035 | - |
231
- | 0.6088 | 4800 | 0.0088 | - |
232
- | 0.6151 | 4850 | 0.0079 | - |
233
- | 0.6214 | 4900 | 0.0094 | - |
234
- | 0.6278 | 4950 | 0.0105 | - |
235
- | 0.6341 | 5000 | 0.0091 | - |
236
- | 0.6405 | 5050 | 0.0109 | - |
237
- | 0.6468 | 5100 | 0.0081 | - |
238
- | 0.6531 | 5150 | 0.0087 | - |
239
- | 0.6595 | 5200 | 0.0091 | - |
240
- | 0.6658 | 5250 | 0.0071 | - |
241
- | 0.6722 | 5300 | 0.0072 | - |
242
- | 0.6785 | 5350 | 0.0084 | - |
243
- | 0.6848 | 5400 | 0.0099 | - |
244
- | 0.6912 | 5450 | 0.004 | - |
245
- | 0.6975 | 5500 | 0.0038 | - |
246
- | 0.7039 | 5550 | 0.0072 | - |
247
- | 0.7102 | 5600 | 0.0084 | - |
248
- | 0.7166 | 5650 | 0.004 | - |
249
- | 0.7229 | 5700 | 0.0077 | - |
250
- | 0.7292 | 5750 | 0.0066 | - |
251
- | 0.7356 | 5800 | 0.0043 | - |
252
- | 0.7419 | 5850 | 0.0054 | - |
253
- | 0.7483 | 5900 | 0.0107 | - |
254
- | 0.7546 | 5950 | 0.0046 | - |
255
- | 0.7609 | 6000 | 0.0075 | - |
256
- | 0.7673 | 6050 | 0.0106 | - |
257
- | 0.7736 | 6100 | 0.0063 | - |
258
- | 0.7800 | 6150 | 0.007 | - |
259
- | 0.7863 | 6200 | 0.0066 | - |
260
- | 0.7926 | 6250 | 0.0067 | - |
261
- | 0.7990 | 6300 | 0.0078 | - |
262
- | 0.8053 | 6350 | 0.0093 | - |
263
- | 0.8117 | 6400 | 0.0055 | - |
264
- | 0.8180 | 6450 | 0.0074 | - |
265
- | 0.8244 | 6500 | 0.0115 | - |
266
- | 0.8307 | 6550 | 0.0058 | - |
267
- | 0.8370 | 6600 | 0.005 | - |
268
- | 0.8434 | 6650 | 0.007 | - |
269
- | 0.8497 | 6700 | 0.0053 | - |
270
- | 0.8561 | 6750 | 0.0086 | - |
271
- | 0.8624 | 6800 | 0.0054 | - |
272
- | 0.8687 | 6850 | 0.0055 | - |
273
- | 0.8751 | 6900 | 0.006 | - |
274
- | 0.8814 | 6950 | 0.0068 | - |
275
- | 0.8878 | 7000 | 0.0103 | - |
276
- | 0.8941 | 7050 | 0.0054 | - |
277
- | 0.9004 | 7100 | 0.007 | - |
278
- | 0.9068 | 7150 | 0.0047 | - |
279
- | 0.9131 | 7200 | 0.0076 | - |
280
- | 0.9195 | 7250 | 0.0077 | - |
281
- | 0.9258 | 7300 | 0.0058 | - |
282
- | 0.9321 | 7350 | 0.0056 | - |
283
- | 0.9385 | 7400 | 0.0041 | - |
284
- | 0.9448 | 7450 | 0.0062 | - |
285
- | 0.9512 | 7500 | 0.0044 | - |
286
- | 0.9575 | 7550 | 0.0042 | - |
287
- | 0.9639 | 7600 | 0.0095 | - |
288
- | 0.9702 | 7650 | 0.0045 | - |
289
- | 0.9765 | 7700 | 0.0062 | - |
290
- | 0.9829 | 7750 | 0.0036 | - |
291
- | 0.9892 | 7800 | 0.0086 | - |
292
- | 0.9956 | 7850 | 0.0071 | - |
293
- | 1.0019 | 7900 | 0.0103 | - |
294
- | 1.0082 | 7950 | 0.004 | - |
295
- | 1.0146 | 8000 | 0.0059 | - |
296
- | 1.0209 | 8050 | 0.0053 | - |
297
- | 1.0273 | 8100 | 0.0079 | - |
298
- | 1.0336 | 8150 | 0.0078 | - |
299
- | 1.0399 | 8200 | 0.0077 | - |
300
- | 1.0463 | 8250 | 0.0062 | - |
301
- | 1.0526 | 8300 | 0.005 | - |
302
- | 1.0590 | 8350 | 0.0071 | - |
303
- | 1.0653 | 8400 | 0.0042 | - |
304
- | 1.0717 | 8450 | 0.0054 | - |
305
- | 1.0780 | 8500 | 0.0048 | - |
306
- | 1.0843 | 8550 | 0.0045 | - |
307
- | 1.0907 | 8600 | 0.0062 | - |
308
- | 1.0970 | 8650 | 0.0094 | - |
309
- | 1.1034 | 8700 | 0.0043 | - |
310
- | 1.1097 | 8750 | 0.004 | - |
311
- | 1.1160 | 8800 | 0.003 | - |
312
- | 1.1224 | 8850 | 0.0026 | - |
313
- | 1.1287 | 8900 | 0.0051 | - |
314
- | 1.1351 | 8950 | 0.0046 | - |
315
- | 1.1414 | 9000 | 0.0046 | - |
316
- | 1.1477 | 9050 | 0.0075 | - |
317
- | 1.1541 | 9100 | 0.0066 | - |
318
- | 1.1604 | 9150 | 0.0078 | - |
319
- | 1.1668 | 9200 | 0.0069 | - |
320
- | 1.1731 | 9250 | 0.0087 | - |
321
- | 1.1795 | 9300 | 0.0047 | - |
322
- | 1.1858 | 9350 | 0.0037 | - |
323
- | 1.1921 | 9400 | 0.007 | - |
324
- | 1.1985 | 9450 | 0.0069 | - |
325
- | 1.2048 | 9500 | 0.0061 | - |
326
- | 1.2112 | 9550 | 0.0047 | - |
327
- | 1.2175 | 9600 | 0.0065 | - |
328
- | 1.2238 | 9650 | 0.0058 | - |
329
- | 1.2302 | 9700 | 0.0061 | - |
330
- | 1.2365 | 9750 | 0.0055 | - |
331
- | 1.2429 | 9800 | 0.0064 | - |
332
- | 1.2492 | 9850 | 0.0041 | - |
333
- | 1.2555 | 9900 | 0.0086 | - |
334
- | 1.2619 | 9950 | 0.0053 | - |
335
- | 1.2682 | 10000 | 0.0047 | - |
336
- | 1.2746 | 10050 | 0.0053 | - |
337
- | 1.2809 | 10100 | 0.003 | - |
338
- | 1.2873 | 10150 | 0.0046 | - |
339
- | 1.2936 | 10200 | 0.0052 | - |
340
- | 1.2999 | 10250 | 0.0056 | - |
341
- | 1.3063 | 10300 | 0.0052 | - |
342
- | 1.3126 | 10350 | 0.0079 | - |
343
- | 1.3190 | 10400 | 0.006 | - |
344
- | 1.3253 | 10450 | 0.0055 | - |
345
- | 1.3316 | 10500 | 0.0066 | - |
346
- | 1.3380 | 10550 | 0.0076 | - |
347
- | 1.3443 | 10600 | 0.0037 | - |
348
- | 1.3507 | 10650 | 0.0066 | - |
349
- | 1.3570 | 10700 | 0.0059 | - |
350
- | 1.3633 | 10750 | 0.0057 | - |
351
- | 1.3697 | 10800 | 0.0038 | - |
352
- | 1.3760 | 10850 | 0.0044 | - |
353
- | 1.3824 | 10900 | 0.0059 | - |
354
- | 1.3887 | 10950 | 0.0073 | - |
355
- | 1.3951 | 11000 | 0.0055 | - |
356
- | 1.4014 | 11050 | 0.0039 | - |
357
- | 1.4077 | 11100 | 0.0054 | - |
358
- | 1.4141 | 11150 | 0.0068 | - |
359
- | 1.4204 | 11200 | 0.0067 | - |
360
- | 1.4268 | 11250 | 0.0041 | - |
361
- | 1.4331 | 11300 | 0.0076 | - |
362
- | 1.4394 | 11350 | 0.0071 | - |
363
- | 1.4458 | 11400 | 0.0044 | - |
364
- | 1.4521 | 11450 | 0.0061 | - |
365
- | 1.4585 | 11500 | 0.0039 | - |
366
- | 1.4648 | 11550 | 0.006 | - |
367
- | 1.4711 | 11600 | 0.0045 | - |
368
- | 1.4775 | 11650 | 0.0044 | - |
369
- | 1.4838 | 11700 | 0.0063 | - |
370
- | 1.4902 | 11750 | 0.0061 | - |
371
- | 1.4965 | 11800 | 0.0058 | - |
372
- | 1.5029 | 11850 | 0.0039 | - |
373
- | 1.5092 | 11900 | 0.0041 | - |
374
- | 1.5155 | 11950 | 0.0052 | - |
375
- | 1.5219 | 12000 | 0.0034 | - |
376
- | 1.5282 | 12050 | 0.0078 | - |
377
- | 1.5346 | 12100 | 0.0049 | - |
378
- | 1.5409 | 12150 | 0.0064 | - |
379
- | 1.5472 | 12200 | 0.0063 | - |
380
- | 1.5536 | 12250 | 0.0068 | - |
381
- | 1.5599 | 12300 | 0.008 | - |
382
- | 1.5663 | 12350 | 0.0043 | - |
383
- | 1.5726 | 12400 | 0.0057 | - |
384
- | 1.5789 | 12450 | 0.0044 | - |
385
- | 1.5853 | 12500 | 0.0048 | - |
386
- | 1.5916 | 12550 | 0.0049 | - |
387
- | 1.5980 | 12600 | 0.0052 | - |
388
- | 1.6043 | 12650 | 0.0061 | - |
389
- | 1.6107 | 12700 | 0.0066 | - |
390
- | 1.6170 | 12750 | 0.0079 | - |
391
- | 1.6233 | 12800 | 0.0047 | - |
392
- | 1.6297 | 12850 | 0.005 | - |
393
- | 1.6360 | 12900 | 0.0034 | - |
394
- | 1.6424 | 12950 | 0.0051 | - |
395
- | 1.6487 | 13000 | 0.006 | - |
396
- | 1.6550 | 13050 | 0.0046 | - |
397
- | 1.6614 | 13100 | 0.003 | - |
398
- | 1.6677 | 13150 | 0.0055 | - |
399
- | 1.6741 | 13200 | 0.0069 | - |
400
- | 1.6804 | 13250 | 0.0033 | - |
401
- | 1.6867 | 13300 | 0.0095 | - |
402
- | 1.6931 | 13350 | 0.0043 | - |
403
- | 1.6994 | 13400 | 0.0055 | - |
404
- | 1.7058 | 13450 | 0.0081 | - |
405
- | 1.7121 | 13500 | 0.0042 | - |
406
- | 1.7185 | 13550 | 0.0081 | - |
407
- | 1.7248 | 13600 | 0.0055 | - |
408
- | 1.7311 | 13650 | 0.0043 | - |
409
- | 1.7375 | 13700 | 0.0033 | - |
410
- | 1.7438 | 13750 | 0.0044 | - |
411
- | 1.7502 | 13800 | 0.0062 | - |
412
- | 1.7565 | 13850 | 0.0032 | - |
413
- | 1.7628 | 13900 | 0.0043 | - |
414
- | 1.7692 | 13950 | 0.0079 | - |
415
- | 1.7755 | 14000 | 0.0053 | - |
416
- | 1.7819 | 14050 | 0.0044 | - |
417
- | 1.7882 | 14100 | 0.0064 | - |
418
- | 1.7945 | 14150 | 0.0051 | - |
419
- | 1.8009 | 14200 | 0.0088 | - |
420
- | 1.8072 | 14250 | 0.0048 | - |
421
- | 1.8136 | 14300 | 0.0044 | - |
422
- | 1.8199 | 14350 | 0.0071 | - |
423
- | 1.8263 | 14400 | 0.0058 | - |
424
- | 1.8326 | 14450 | 0.007 | - |
425
- | 1.8389 | 14500 | 0.0028 | - |
426
- | 1.8453 | 14550 | 0.0046 | - |
427
- | 1.8516 | 14600 | 0.0061 | - |
428
- | 1.8580 | 14650 | 0.0054 | - |
429
- | 1.8643 | 14700 | 0.004 | - |
430
- | 1.8706 | 14750 | 0.0034 | - |
431
- | 1.8770 | 14800 | 0.0044 | - |
432
- | 1.8833 | 14850 | 0.0033 | - |
433
- | 1.8897 | 14900 | 0.007 | - |
434
- | 1.8960 | 14950 | 0.0044 | - |
435
- | 1.9023 | 15000 | 0.0045 | - |
436
- | 1.9087 | 15050 | 0.0045 | - |
437
- | 1.9150 | 15100 | 0.0093 | - |
438
- | 1.9214 | 15150 | 0.0036 | - |
439
- | 1.9277 | 15200 | 0.0055 | - |
440
- | 1.9341 | 15250 | 0.0037 | - |
441
- | 1.9404 | 15300 | 0.0043 | - |
442
- | 1.9467 | 15350 | 0.0034 | - |
443
- | 1.9531 | 15400 | 0.0068 | - |
444
- | 1.9594 | 15450 | 0.0058 | - |
445
- | 1.9658 | 15500 | 0.0069 | - |
446
- | 1.9721 | 15550 | 0.0081 | - |
447
- | 1.9784 | 15600 | 0.0061 | - |
448
- | 1.9848 | 15650 | 0.0039 | - |
449
- | 1.9911 | 15700 | 0.0065 | - |
450
- | 1.9975 | 15750 | 0.0048 | - |
451
-
452
- ### Framework Versions
453
- - Python: 3.12.3
454
- - SetFit: 1.1.0
455
- - Sentence Transformers: 3.2.0
456
- - Transformers: 4.45.2
457
- - PyTorch: 2.5.0+cu121
458
- - Datasets: 3.0.1
459
- - Tokenizers: 0.20.1
460
-
461
- ## Citation
462
-
463
- ### BibTeX
464
- ```bibtex
465
- @article{https://doi.org/10.48550/arxiv.2209.11055,
466
- doi = {10.48550/ARXIV.2209.11055},
467
- url = {https://arxiv.org/abs/2209.11055},
468
- author = {Tunstall, Lewis and Reimers, Nils and Jo, Unso Eun Seo and Bates, Luke and Korat, Daniel and Wasserblat, Moshe and Pereg, Oren},
469
- keywords = {Computation and Language (cs.CL), FOS: Computer and information sciences, FOS: Computer and information sciences},
470
- title = {Efficient Few-Shot Learning Without Prompts},
471
- publisher = {arXiv},
472
- year = {2022},
473
- copyright = {Creative Commons Attribution 4.0 International}
474
- }
475
- ```
476
-
477
- <!--
478
- ## Glossary
479
-
480
- *Clearly define terms in order to be accessible across audiences.*
481
- -->
482
-
483
- <!--
484
- ## Model Card Authors
485
-
486
- *Lists the people who create the model card, providing recognition and accountability for the detailed work that goes into its construction.*
487
- -->
488
-
489
- <!--
490
- ## Model Card Contact
491
-
492
- *Provides a way for people who have updates to the Model Card, suggestions, or questions, to contact the Model Card authors.*
493
  -->
 
1
+ ---
2
+ base_model: sentence-transformers/paraphrase-multilingual-MiniLM-L12-v2
3
+ library_name: setfit
4
+ metrics:
5
+ - accuracy
6
+ pipeline_tag: text-classification
7
+ tags:
8
+ - setfit
9
+ - sentence-transformers
10
+ - text-classification
11
+ - generated_from_setfit_trainer
12
+ widget: []
13
+ inference: true
14
+ ---
15
+
16
+ # SetFit with sentence-transformers/paraphrase-multilingual-MiniLM-L12-v2
17
+
18
+ This is a [SetFit](https://github.com/huggingface/setfit) model that can be used for Text Classification. This SetFit model uses [sentence-transformers/paraphrase-multilingual-MiniLM-L12-v2](https://huggingface.co/sentence-transformers/paraphrase-multilingual-MiniLM-L12-v2) as the Sentence Transformer embedding model. A OneVsRestClassifier instance is used for classification.
19
+
20
+ The model has been trained using an efficient few-shot learning technique that involves:
21
+
22
+ 1. Fine-tuning a [Sentence Transformer](https://www.sbert.net) with contrastive learning.
23
+ 2. Training a classification head with features from the fine-tuned Sentence Transformer.
24
+
25
+ ## Model Details
26
+
27
+ ### Model Description
28
+ - **Model Type:** SetFit
29
+ - **Sentence Transformer body:** [sentence-transformers/paraphrase-multilingual-MiniLM-L12-v2](https://huggingface.co/sentence-transformers/paraphrase-multilingual-MiniLM-L12-v2)
30
+ - **Classification head:** a OneVsRestClassifier instance
31
+ - **Maximum Sequence Length:** 128 tokens
32
+ - **Number of Classes:** 6 classes
33
+ <!-- - **Training Dataset:** [Unknown](https://huggingface.co/datasets/unknown) -->
34
+ <!-- - **Language:** Unknown -->
35
+ <!-- - **License:** Unknown -->
36
+
37
+ ### Model Sources
38
+
39
+ - **Repository:** [SetFit on GitHub](https://github.com/huggingface/setfit)
40
+ - **Paper:** [Efficient Few-Shot Learning Without Prompts](https://arxiv.org/abs/2209.11055)
41
+ - **Blogpost:** [SetFit: Efficient Few-Shot Learning Without Prompts](https://huggingface.co/blog/setfit)
42
+
43
+ ## Uses
44
+
45
+ ### Direct Use for Inference
46
+
47
+ First install the SetFit library:
48
+
49
+ ```bash
50
+ pip install setfit
51
+ ```
52
+
53
+ Then you can load this model and run inference.
54
+
55
+ ```python
56
+ from setfit import SetFitModel
57
+
58
+ # Download from the 🤗 Hub
59
+ model = SetFitModel.from_pretrained("Chernoffface/fs-setfit-multilable-model")
60
+ # Run inference
61
+ preds = model("I loved the spiderman movie!")
62
+ ```
63
+
64
+ <!--
65
+ ### Downstream Use
66
+
67
+ *List how someone could finetune this model on their own dataset.*
68
+ -->
69
+
70
+ <!--
71
+ ### Out-of-Scope Use
72
+
73
+ *List how the model may foreseeably be misused and address what users ought not to do with the model.*
74
+ -->
75
+
76
+ <!--
77
+ ## Bias, Risks and Limitations
78
+
79
+ *What are the known or foreseeable issues stemming from this model? You could also flag here known failure cases or weaknesses of the model.*
80
+ -->
81
+
82
+ <!--
83
+ ### Recommendations
84
+
85
+ *What are recommendations with respect to the foreseeable issues? For example, filtering explicit content.*
86
+ -->
87
+
88
+ ## Training Details
89
+
90
+ ### Framework Versions
91
+ - Python: 3.12.7
92
+ - SetFit: 1.1.0
93
+ - Sentence Transformers: 3.2.1
94
+ - Transformers: 4.45.2
95
+ - PyTorch: 2.5.0+cu121
96
+ - Datasets: 2.19.1
97
+ - Tokenizers: 0.20.1
98
+
99
+ ## Citation
100
+
101
+ ### BibTeX
102
+ ```bibtex
103
+ @article{https://doi.org/10.48550/arxiv.2209.11055,
104
+ doi = {10.48550/ARXIV.2209.11055},
105
+ url = {https://arxiv.org/abs/2209.11055},
106
+ author = {Tunstall, Lewis and Reimers, Nils and Jo, Unso Eun Seo and Bates, Luke and Korat, Daniel and Wasserblat, Moshe and Pereg, Oren},
107
+ keywords = {Computation and Language (cs.CL), FOS: Computer and information sciences, FOS: Computer and information sciences},
108
+ title = {Efficient Few-Shot Learning Without Prompts},
109
+ publisher = {arXiv},
110
+ year = {2022},
111
+ copyright = {Creative Commons Attribution 4.0 International}
112
+ }
113
+ ```
114
+
115
+ <!--
116
+ ## Glossary
117
+
118
+ *Clearly define terms in order to be accessible across audiences.*
119
+ -->
120
+
121
+ <!--
122
+ ## Model Card Authors
123
+
124
+ *Lists the people who create the model card, providing recognition and accountability for the detailed work that goes into its construction.*
125
+ -->
126
+
127
+ <!--
128
+ ## Model Card Contact
129
+
130
+ *Provides a way for people who have updates to the Model Card, suggestions, or questions, to contact the Model Card authors.*
131
  -->
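
The card describes a multi-label setup: six classes behind a OneVsRestClassifier head, so `model(...)` returns one 0/1 indicator per label rather than a single class. Below is a minimal sketch of mapping those indicators back to the label names from the config_setfit.json diff further down; the assumption that this label order matches the head's output order should be verified against the trained head.

```python
from setfit import SetFitModel

# Label names as listed in config_setfit.json; assuming this order matches the head's outputs.
LABELS = [
    "Data Analytics & KI",
    "Softwareentwicklung",
    "Nutzerzentriertes Design",
    "IT-Architektur",
    "Hardware/Robotikentwicklung",
    "Quantencomputing",
]

model = SetFitModel.from_pretrained("Chernoffface/fs-setfit-multilable-model")

# For a multi-label head, each prediction row is a vector of 0/1 indicators, one per label.
preds = model(["Grundlagen der Fachdidaktik Pädagogik", "I loved the spiderman movie!"])
for row in preds:
    print([label for label, flag in zip(LABELS, row) if flag])
```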
config.json CHANGED
@@ -1,5 +1,5 @@
1
  {
2
- "_name_or_path": "sentence-transformers/paraphrase-multilingual-MiniLM-L12-v2",
3
  "architectures": [
4
  "BertModel"
5
  ],
 
1
  {
2
+ "_name_or_path": "models",
3
  "architectures": [
4
  "BertModel"
5
  ],
config_sentence_transformers.json CHANGED
@@ -1,6 +1,6 @@
1
  {
2
  "__version__": {
3
- "sentence_transformers": "3.2.0",
4
  "transformers": "4.45.2",
5
  "pytorch": "2.5.0+cu121"
6
  },
 
1
  {
2
  "__version__": {
3
+ "sentence_transformers": "3.2.1",
4
  "transformers": "4.45.2",
5
  "pytorch": "2.5.0+cu121"
6
  },
config_setfit.json CHANGED
@@ -1,11 +1,11 @@
1
  {
2
- "normalize_embeddings": false,
3
  "labels": [
4
- "Hardware-/Robotikentwicklung",
5
  "Softwareentwicklung",
6
  "Nutzerzentriertes Design",
7
- "Data Analytics & KI",
8
- "Quantencomputing",
9
- "IT-Architektur"
10
- ]
 
11
  }
 
1
  {
 
2
  "labels": [
3
+ "Data Analytics & KI",
4
  "Softwareentwicklung",
5
  "Nutzerzentriertes Design",
6
+ "IT-Architektur",
7
+ "Hardware/Robotikentwicklung",
8
+ "Quantencomputing"
9
+ ],
10
+ "normalize_embeddings": false
11
  }
model.safetensors CHANGED
@@ -1,3 +1,3 @@
1
  version https://git-lfs.github.com/spec/v1
2
- oid sha256:cb68e3a38d27f6bc16425cce11ae099d257e05424b0fc224747e41546e10608e
3
  size 470637416
 
1
  version https://git-lfs.github.com/spec/v1
2
+ oid sha256:8ac69c669c2aa60b064c0826da2a527f2f2c54baa92a735a33e8011d66370392
3
  size 470637416
model_head.pkl CHANGED
@@ -1,3 +1,3 @@
1
  version https://git-lfs.github.com/spec/v1
2
- oid sha256:cd84bedbd040211b66cb2ea1c1b9c7031f697dc4f79f65021eddc6a6f2ed30f7
3
- size 21473
 
1
  version https://git-lfs.github.com/spec/v1
2
+ oid sha256:334239ff7247f8bdcaa00eb285691a4ea26515b6f5d700dc7413bd5aac434767
3
+ size 21460
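
For reference, the hyperparameters listed in the previous card (batch size 16, 2 epochs, oversampling, body/head learning rate 2e-5, seed 42, one-vs-rest multi-label head) correspond roughly to a setfit 1.1 training setup like the sketch below. This is an illustrative reconstruction, not the author's actual script; the training texts and label vectors are placeholders.

```python
from datasets import Dataset
from setfit import SetFitModel, Trainer, TrainingArguments

# Placeholder multi-label data: "label" is a list of 0/1 indicators, one per class.
train_dataset = Dataset.from_dict({
    "text": ["Grundlagen der Fachdidaktik Pädagogik", "Constraint-Based Physics Simulation"],
    "label": [[0, 0, 0, 0, 0, 0], [0, 1, 0, 0, 0, 0]],
})

model = SetFitModel.from_pretrained(
    "sentence-transformers/paraphrase-multilingual-MiniLM-L12-v2",
    multi_target_strategy="one-vs-rest",  # yields the OneVsRestClassifier head described in the card
)

args = TrainingArguments(
    batch_size=16,
    num_epochs=2,
    sampling_strategy="oversampling",
    body_learning_rate=2e-5,
    head_learning_rate=2e-5,
    seed=42,
)

trainer = Trainer(model=model, args=args, train_dataset=train_dataset)
trainer.train()
```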