SetFit with sentence-transformers/paraphrase-mpnet-base-v2

This is a SetFit model that can be used for Text Classification. It uses sentence-transformers/paraphrase-mpnet-base-v2 as the Sentence Transformer embedding model and a LogisticRegression instance as the classification head.

The model has been trained using an efficient few-shot learning technique that involves two steps (see the sketch after this list):

  1. Fine-tuning a Sentence Transformer with contrastive learning.
  2. Training a classification head with features from the fine-tuned Sentence Transformer.
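
Below is a minimal training sketch using the SetFit Trainer; the CSV file name and the "text"/"label" column names are assumptions for illustration, not details of this model's actual training setup.

from datasets import load_dataset
from setfit import SetFitModel, Trainer, TrainingArguments

# Load the Sentence Transformer body; the classification head defaults to a
# scikit-learn LogisticRegression, as used by this model.
model = SetFitModel.from_pretrained("sentence-transformers/paraphrase-mpnet-base-v2")

# Hypothetical few-shot dataset with "text" and "label" columns.
train_dataset = load_dataset("csv", data_files="train.csv")["train"]

args = TrainingArguments(batch_size=16, num_epochs=5, num_iterations=20)
trainer = Trainer(model=model, args=args, train_dataset=train_dataset)

# Step 1 (contrastive fine-tuning of the body) and step 2 (fitting the
# classification head on the tuned embeddings) both run inside train().
trainer.train()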

Model Details

Model Description

  • Model Type: SetFit
  • Sentence Transformer body: sentence-transformers/paraphrase-mpnet-base-v2
  • Classification head: a LogisticRegression instance
  • Number of Classes: 119

Model Sources

  • Repository: https://github.com/huggingface/setfit
  • Paper: https://arxiv.org/abs/2209.11055

Model Labels

Each label ID below is followed by a few of its training examples:
79
  • 'peony middle notes'
  • 'lemon middle notes'
  • 'coconut middle notes'
86
  • 'no print/no pattern'
  • 'two tone'
  • 'diagonal stripe'
37
  • 'eel skin leather'
  • 'metal'
  • 'raffia'
82
  • 'collarless'
  • 'peaked lapel'
  • 'front keyhole'
95
  • 'standard toe'
  • 'wide toe'
  • 'extra wide toe'
83
  • 'indoor'
  • 'hike'
  • 'beach'
107
  • 'surplice'
  • 'messenger bag'
  • 'camera bag'
19
  • 'mary jane'
  • 'zip around wallet'
  • 'tongue buckle'
102
  • 'slits at knee'
  • 'slits above hips'
  • 'front slit at hem'
35
  • 'tie'
  • 'gem embellishment'
  • 'caged'
18
  • 'rolo chain'
  • 'cord bracelet'
  • 'figaro'
65
  • 'wheat protein'
  • 'rosemary ingredient'
  • 'pea protein'
68
  • 'bath towel'
  • 'art print'
  • 'reusable bottle'
40
  • 'polyfill'
  • 'silk fill'
  • 'feather fill'
50
  • 'palm grip'
  • 'carpenter hook'
  • 'storm flap'
113
  • 'wide waistband'
  • 'elastic inset'
  • 'belt loops'
75
  • 'glass'
  • 'acrylic'
  • 'opal'
11
  • 'foam cups'
  • 'wire'
  • 'molded cups'
38
  • 'dual layer fabric'
  • '2 way stretch'
  • '4 way stretch'
63
  • 'light support'
  • 'medium supprt'
  • 'high support'
44
  • 'face'
  • 'hand'
  • 'neck/dècolletage'
115
  • 'soy wax'
  • 'paraffin wax'
42
  • 'regular'
  • 'tailored'
  • 'fitted'
97
  • 'king'
  • 'euro'
  • 'standard'
70
  • 'wrist length'
  • 'above thigh'
  • 'below bust'
34
  • 'feminine'
  • 'religious'
  • 'boho'
10
  • 'slim'
  • 'regular'
15
  • '6-10 oz'
  • '11-20 oz'
77
  • 'rose gold metal'
  • 'gold plated'
  • 'alloy'
43
  • 'contrast inner lining'
  • 'simple seaming'
  • 'princess seams'
7
  • 'neroli base notes'
  • 'amber base notes'
  • 'musk base notes'
17
  • 'spot clean'
  • 'dry clean'
  • 'microwave safe'
8
  • 'nourishing'
  • 'firming'
  • 'soothing/healing'
103
  • 'lugged soles'
  • 'non marking soles'
26
  • 'wall control'
  • 'switch control'
99
  • 'fitted sleeves'
  • 'fitted sleeve'
  • 'structured sleeves'
33
  • 'rim'
  • 'feet'
  • '5 panel construction'
64
  • 'mineral oil free'
  • 'propylene glycol free'
  • 'paraffin free'
96
  • 'double strap'
  • 'spaghetti straps'
  • 'thin straps'
1
  • 'shoulder back'
  • 'full coverage'
  • 'low back'
62
  • 'rustic'
  • 'coastal'
  • 'scandinavian'
39
  • 'metallic'
  • 'swiss dot'
  • 'base layer'
60
  • 'halloween'
  • 'christmas holiday'
92
  • 'seamless'
  • 'mid rise waist seam'
  • 'flat seam'
114
  • 'ultra high rise'
  • 'mid rise'
  • 'high waisted'
105
  • 'top handle'
  • 'detachable straps'
  • 'chain strap'
90
  • 'floral'
  • 'psychedelic print'
  • 'paisley'
91
  • 'night'
  • 'day'
45
  • 'serum formulation'
  • 'cream/creme'
  • 'solid'
59
  • 'strong hold'
  • 'flexible hold'
46
  • 'leather'
  • 'fresh aquatic'
  • 'green aromatic'
21
  • 'matte'
  • 'metallic'
  • 'olive'
69
  • 'cinnamon key notes'
  • 'violet key notes'
  • 'pepper key notes'
101
  • 'dropped shoulder'
  • 'puff shoulder'
  • 'flutter sleeve'
61
  • 'summer'
  • 'everyday'
  • 'indoor'
104
  • 'wedding guest'
  • 'bridal'
  • 'halloween'
32
  • 'indigo wash'
  • 'acid wash'
  • 'stonewash'
51
  • 'still life graphic'
  • 'sports graphic'
  • 'star wars'
48
  • 'beige'
  • 'black'
  • 'rose gold frame'
87
  • 'medium pile'
  • 'low pile'
22
  • 'bright'
  • 'pastel'
  • 'light'
41
  • 'matte finish'
  • 'shiny finish'
93
  • 'no buckle'
  • 'geometric shape'
  • 'straight silhouette'
71
  • 'polarized'
  • 'color tinted'
  • 'mirrored'
2
  • 'split back'
  • 'racer back'
  • 'open back'
89
  • 'round stitch pocket'
  • 'seam pocket'
  • 'kangaroo pocket'
20
  • 'removable hoodie'
  • 'packable hood collar'
  • 'hooded'
52
  • 'thick'
  • 'medium thick'
55
  • 'amber head notes'
  • 'lime head notes'
  • 'musk head notes'
58
  • 'back curved hem'
  • 'twist hem'
  • 'ribbed hem'
118
  • 'light wood'
  • 'medium wood'
25
  • 'gifts for him'
  • 'apres ski'
  • 'cozy'
109
  • 'closed toe'
  • 'square toe'
  • 'round toe'
30
  • 'extended cuffs'
  • 'storm cuffs'
  • 'elastic cuff'
24
  • 'ingrown hairs'
  • 'frizz'
  • 'redness'
9
  • 'high cut'
  • 'string bikini'
94
  • 'opaque'
  • 'sheer'
16
  • '2 card slot'
  • 'card slots'
78
  • 'gothcore'
  • 'vanilla girl'
  • 'dyed out'
4
  • 'layered'
  • 'bangle'
  • 'cuff'
23
  • 'parfum'
  • 'eau de toilette'
111
  • 'delicate'
  • 'statement'
12
  • 'flat brim'
  • 'curved brim'
  • 'fold over brim'
98
  • 'dry'
  • 'acne prone'
  • 'mature'
57
  • 'stacked heel'
  • 'kitten heel'
  • 'cone heel'
67
  • 'id slot'
  • 'interior pocket'
  • 'interior zipper pocket'
31
  • 'light wash'
  • 'medium wash'
  • 'colored'
85
  • 'detailed stitching pant'
  • 'simple seaming'
116
  • 'knotted'
  • 'percale'
  • 'waffle weave'
88
  • 'shag'
  • 'cut pile'
74
  • 'study hall'
  • 'y2k'
  • 'enchanted'
72
  • 'fur'
  • 'fleece'
  • 'mesh'
108
  • 'animal'
  • 'love'
73
  • 'unlined'
  • 'fully lined'
  • 'partially lined'
13
  • 'wide brim'
  • 'medium brim'
76
  • 'bpa free material'
  • 'scratch resistant material'
54
  • 'straight handle'
  • 'curved handle'
100
  • 'rolled up sleeves'
  • '3/4 sleeve'
  • 'bracelet length'
84
  • 'manual open'
  • 'auto open'
14
  • 'wide'
  • 'medium'
27
  • 'superhero'
  • 'disney'
49
  • 'half rim'
  • 'full rim'
29
  • 'tall crown'
  • 'short crown'
106
  • 'low stretch'
  • 'non stretch'
112
  • 'mid vamp'
  • 'high vamp'
66
  • 'large interior'
  • 'medium interior'
  • 'small interior'
53
  • 'all hair types'
  • 'damaged/dry hair'
117
  • 'light weight'
  • 'mid weight'
81
  • 'low cut'
  • 'mid chest neckline'
  • 'open front'
5
  • 'thin band'
  • 'soft band elastic'
  • 'elastic band'
28
  • 'flat top crown'
  • 'round crown'
  • 'no crown'
56
  • 'ultra high heel'
  • 'mid heel'
  • 'high heel'
110
  • 'relaxed'
  • 'tailored'
47
  • 'uplifting'
  • 'bold'
3
  • 'changing pad'
  • 'bottle pocket'
0
  • 'squeeze dispenser'
  • 'dropper'
80
  • 'wall mount'
  • 'ceiling mount'
6
  • 'medium'
  • 'wide'
36
  • 'exterior pocket'
  • 'exterior snap pocket'

Evaluation

Metrics

Label Accuracy

  • all: 0.5762
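
This score can in principle be reproduced with the SetFit Trainer's evaluate method; a minimal sketch, assuming a held-out split with "text" and "label" columns (the two example rows here are illustrative only, not the real evaluation set):

from datasets import Dataset
from setfit import SetFitModel, Trainer

model = SetFitModel.from_pretrained("kaustubhgap/kaustubh_setfit")

# Hypothetical held-out split; a real evaluation would use the full test set.
eval_dataset = Dataset.from_dict({"text": ["soy wax", "kitten heel"], "label": [115, 57]})

trainer = Trainer(model=model, metric="accuracy", eval_dataset=eval_dataset)
print(trainer.evaluate())  # e.g. {"accuracy": ...}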

Uses

Direct Use for Inference

First install the SetFit library:

pip install setfit

Then you can load this model and run inference:

from setfit import SetFitModel

# Download from the 🤗 Hub
model = SetFitModel.from_pretrained("kaustubhgap/kaustubh_setfit")
# Run inference
preds = model("tube")
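
The loaded model also exposes batched prediction and class probabilities; a small sketch reusing the model object above (the example texts are illustrative):

# Predict several texts at once
preds = model.predict(["tube", "kitten heel", "soy wax"])

# Per-class probabilities are available because the head is a LogisticRegression
probs = model.predict_proba(["tube"])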

Training Details

Training Set Metrics

Word count per training example: min 1, median 1.7047, max 6

Training sample count per label (each row gives a label ID followed by its sample count):
0 2
1 5
2 12
3 2
4 6
5 3
6 2
7 12
8 16
9 2
10 2
11 11
12 4
13 2
14 2
15 2
16 2
17 6
18 9
19 63
20 8
21 31
22 6
23 2
24 13
25 5
26 2
27 2
28 3
29 2
30 13
31 3
32 7
33 22
34 12
35 102
36 2
37 119
38 34
39 32
40 6
41 2
42 13
43 17
44 5
45 10
46 6
47 2
48 10
49 2
50 91
51 13
52 2
53 2
54 2
55 12
56 4
57 7
58 17
59 2
60 2
61 7
62 9
63 3
64 14
65 53
66 3
67 6
68 41
69 41
70 33
71 5
72 5
73 4
74 7
75 49
76 2
77 23
78 11
79 12
80 2
81 5
82 33
83 33
84 2
85 2
86 17
87 2
88 2
89 10
90 29
91 2
92 8
93 21
94 2
95 3
96 5
97 10
98 5
99 6
100 6
101 12
102 13
103 2
104 10
105 28
106 2
107 321
108 2
109 10
110 2
111 2
112 2
113 15
114 4
115 2
116 5
117 2
118 2

Training Hyperparameters

  • batch_size: (16, 16)
  • num_epochs: (5, 5)
  • max_steps: -1
  • sampling_strategy: oversampling
  • num_iterations: 20
  • body_learning_rate: (2e-05, 1e-05)
  • head_learning_rate: 0.01
  • loss: CosineSimilarityLoss
  • distance_metric: cosine_distance
  • margin: 0.25
  • end_to_end: False
  • use_amp: False
  • warmup_proportion: 0.1
  • seed: 42
  • eval_max_steps: -1
  • load_best_model_at_end: False
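
As a hedged illustration, the hyperparameters above map onto SetFit's TrainingArguments roughly as follows; distance_metric is omitted because cosine distance is already the default and only affects triplet-based losses.

from sentence_transformers.losses import CosineSimilarityLoss
from setfit import TrainingArguments

args = TrainingArguments(
    batch_size=(16, 16),                # (embedding phase, classifier phase)
    num_epochs=(5, 5),
    max_steps=-1,
    sampling_strategy="oversampling",
    num_iterations=20,
    body_learning_rate=(2e-05, 1e-05),
    head_learning_rate=0.01,
    loss=CosineSimilarityLoss,
    margin=0.25,                        # only used by margin-based losses
    end_to_end=False,
    use_amp=False,
    warmup_proportion=0.1,
    seed=42,
    eval_max_steps=-1,
    load_best_model_at_end=False,
)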

Training Results

Epoch Step Training Loss Validation Loss
0.0002 1 0.2895 -
0.0112 50 0.2531 -
0.0225 100 0.2622 -
0.0337 150 0.2535 -
0.0449 200 0.2144 -
0.0561 250 0.206 -
0.0674 300 0.1583 -
0.0786 350 0.1384 -
0.0898 400 0.1778 -
0.1011 450 0.2111 -
0.1123 500 0.1791 -
0.1235 550 0.2198 -
0.1347 600 0.0918 -
0.1460 650 0.1027 -
0.1572 700 0.1837 -
0.1684 750 0.1762 -
0.1797 800 0.1552 -
0.1909 850 0.2045 -
0.2021 900 0.1338 -
0.2133 950 0.0495 -
0.2246 1000 0.1136 -
0.2358 1050 0.0878 -
0.2470 1100 0.1671 -
0.2583 1150 0.0791 -
0.2695 1200 0.1332 -
0.2807 1250 0.0712 -
0.2919 1300 0.1853 -
0.3032 1350 0.134 -
0.3144 1400 0.1123 -
0.3256 1450 0.0525 -
0.3369 1500 0.0901 -
0.3481 1550 0.1554 -
0.3593 1600 0.0417 -
0.3705 1650 0.0762 -
0.3818 1700 0.0155 -
0.3930 1750 0.0115 -
0.4042 1800 0.0665 -
0.4155 1850 0.0578 -
0.4267 1900 0.0271 -
0.4379 1950 0.1374 -
0.4491 2000 0.1125 -
0.4604 2050 0.0304 -
0.4716 2100 0.0636 -
0.4828 2150 0.0668 -
0.4940 2200 0.1055 -
0.5053 2250 0.1147 -
0.5165 2300 0.0358 -
0.5277 2350 0.1516 -
0.5390 2400 0.008 -
0.5502 2450 0.082 -
0.5614 2500 0.0937 -
0.5726 2550 0.1382 -
0.5839 2600 0.0527 -
0.5951 2650 0.1091 -
0.6063 2700 0.0031 -
0.6176 2750 0.0181 -
0.6288 2800 0.1366 -
0.6400 2850 0.0178 -
0.6512 2900 0.0571 -
0.6625 2950 0.0271 -
0.6737 3000 0.0368 -
0.6849 3050 0.0652 -
0.6962 3100 0.0858 -
0.7074 3150 0.016 -
0.7186 3200 0.0318 -
0.7298 3250 0.0119 -
0.7411 3300 0.0314 -
0.7523 3350 0.008 -
0.7635 3400 0.0192 -
0.7748 3450 0.0363 -
0.7860 3500 0.0474 -
0.7972 3550 0.0172 -
0.8084 3600 0.0308 -
0.8197 3650 0.1168 -
0.8309 3700 0.0367 -
0.8421 3750 0.1572 -
0.8534 3800 0.0865 -
0.8646 3850 0.0124 -
0.8758 3900 0.0674 -
0.8870 3950 0.0534 -
0.8983 4000 0.0042 -
0.9095 4050 0.0503 -
0.9207 4100 0.0753 -
0.9320 4150 0.0079 -
0.9432 4200 0.1386 -
0.9544 4250 0.0693 -
0.9656 4300 0.0505 -
0.9769 4350 0.0153 -
0.9881 4400 0.0456 -
0.9993 4450 0.077 -
1.0 4453 - 0.1885
1.0106 4500 0.0107 -
1.0218 4550 0.0533 -
1.0330 4600 0.0069 -
1.0442 4650 0.0073 -
1.0555 4700 0.0521 -
1.0667 4750 0.0084 -
1.0779 4800 0.0443 -
1.0892 4850 0.0504 -
1.1004 4900 0.0445 -
1.1116 4950 0.0169 -
1.1228 5000 0.016 -
1.1341 5050 0.0046 -
1.1453 5100 0.0103 -
1.1565 5150 0.0404 -
1.1678 5200 0.0117 -
1.1790 5250 0.0399 -
1.1902 5300 0.0598 -
1.2014 5350 0.015 -
1.2127 5400 0.0048 -
1.2239 5450 0.0047 -
1.2351 5500 0.0042 -
1.2464 5550 0.0106 -
1.2576 5600 0.0041 -
1.2688 5650 0.1593 -
1.2800 5700 0.0386 -
1.2913 5750 0.0059 -
1.3025 5800 0.0043 -
1.3137 5850 0.0039 -
1.3249 5900 0.0101 -
1.3362 5950 0.0043 -
1.3474 6000 0.0056 -
1.3586 6050 0.002 -
1.3699 6100 0.0064 -
1.3811 6150 0.0106 -
1.3923 6200 0.03 -
1.4035 6250 0.0945 -
1.4148 6300 0.0025 -
1.4260 6350 0.0631 -
1.4372 6400 0.0068 -
1.4485 6450 0.0583 -
1.4597 6500 0.0015 -
1.4709 6550 0.0042 -
1.4821 6600 0.0093 -
1.4934 6650 0.0046 -
1.5046 6700 0.009 -
1.5158 6750 0.0279 -
1.5271 6800 0.0357 -
1.5383 6850 0.0282 -
1.5495 6900 0.0188 -
1.5607 6950 0.0405 -
1.5720 7000 0.0645 -
1.5832 7050 0.0066 -
1.5944 7100 0.0205 -
1.6057 7150 0.0038 -
1.6169 7200 0.0696 -
1.6281 7250 0.0055 -
1.6393 7300 0.0034 -
1.6506 7350 0.006 -
1.6618 7400 0.015 -
1.6730 7450 0.0023 -
1.6843 7500 0.0173 -
1.6955 7550 0.0601 -
1.7067 7600 0.0039 -
1.7179 7650 0.0201 -
1.7292 7700 0.0206 -
1.7404 7750 0.0042 -
1.7516 7800 0.0156 -
1.7629 7850 0.002 -
1.7741 7900 0.0059 -
1.7853 7950 0.0327 -
1.7965 8000 0.0206 -
1.8078 8050 0.0698 -
1.8190 8100 0.0217 -
1.8302 8150 0.0309 -
1.8415 8200 0.0136 -
1.8527 8250 0.0455 -
1.8639 8300 0.0645 -
1.8751 8350 0.0127 -
1.8864 8400 0.0056 -
1.8976 8450 0.0127 -
1.9088 8500 0.0024 -
1.9201 8550 0.0117 -
1.9313 8600 0.0626 -
1.9425 8650 0.0357 -
1.9537 8700 0.056 -
1.9650 8750 0.0311 -
1.9762 8800 0.0123 -
1.9874 8850 0.0638 -
1.9987 8900 0.0328 -
2.0 8906 - 0.2196
2.0099 8950 0.0015 -
2.0211 9000 0.0178 -
2.0323 9050 0.08 -
2.0436 9100 0.0983 -
2.0548 9150 0.0049 -
2.0660 9200 0.0092 -
2.0773 9250 0.0619 -
2.0885 9300 0.0159 -
2.0997 9350 0.0598 -
2.1109 9400 0.0343 -
2.1222 9450 0.0092 -
2.1334 9500 0.0013 -
2.1446 9550 0.0042 -
2.1558 9600 0.0059 -
2.1671 9650 0.0076 -
2.1783 9700 0.0027 -
2.1895 9750 0.0174 -
2.2008 9800 0.0044 -
2.2120 9850 0.0164 -
2.2232 9900 0.0015 -
2.2344 9950 0.0026 -
2.2457 10000 0.0118 -
2.2569 10050 0.0054 -
2.2681 10100 0.0016 -
2.2794 10150 0.0095 -
2.2906 10200 0.0157 -
2.3018 10250 0.0465 -
2.3130 10300 0.0024 -
2.3243 10350 0.0009 -
2.3355 10400 0.0101 -
2.3467 10450 0.0266 -
2.3580 10500 0.0022 -
2.3692 10550 0.0016 -
2.3804 10600 0.0096 -
2.3916 10650 0.0052 -
2.4029 10700 0.0656 -
2.4141 10750 0.0481 -
2.4253 10800 0.0148 -
2.4366 10850 0.0024 -
2.4478 10900 0.0039 -
2.4590 10950 0.0011 -
2.4702 11000 0.0142 -
2.4815 11050 0.0617 -
2.4927 11100 0.0069 -
2.5039 11150 0.0063 -
2.5152 11200 0.0218 -
2.5264 11250 0.0018 -
2.5376 11300 0.0017 -
2.5488 11350 0.0105 -
2.5601 11400 0.0019 -
2.5713 11450 0.0027 -
2.5825 11500 0.0616 -
2.5938 11550 0.0704 -
2.6050 11600 0.0047 -
2.6162 11650 0.0106 -
2.6274 11700 0.0067 -
2.6387 11750 0.0272 -
2.6499 11800 0.0476 -
2.6611 11850 0.0401 -
2.6724 11900 0.0017 -
2.6836 11950 0.0247 -
2.6948 12000 0.0173 -
2.7060 12050 0.0129 -
2.7173 12100 0.0041 -
2.7285 12150 0.0017 -
2.7397 12200 0.0137 -
2.7510 12250 0.0629 -
2.7622 12300 0.034 -
2.7734 12350 0.0533 -
2.7846 12400 0.057 -
2.7959 12450 0.0153 -
2.8071 12500 0.0023 -
2.8183 12550 0.0013 -
2.8296 12600 0.0014 -
2.8408 12650 0.0023 -
2.8520 12700 0.0026 -
2.8632 12750 0.0027 -
2.8745 12800 0.0064 -
2.8857 12850 0.0174 -
2.8969 12900 0.0017 -
2.9082 12950 0.0242 -
2.9194 13000 0.0487 -
2.9306 13050 0.0022 -
2.9418 13100 0.0108 -
2.9531 13150 0.0079 -
2.9643 13200 0.0108 -
2.9755 13250 0.0027 -
2.9868 13300 0.0053 -
2.9980 13350 0.0039 -
3.0 13359 - 0.2038
3.0092 13400 0.0089 -
3.0204 13450 0.0369 -
3.0317 13500 0.0107 -
3.0429 13550 0.0187 -
3.0541 13600 0.0038 -
3.0653 13650 0.0072 -
3.0766 13700 0.005 -
3.0878 13750 0.0192 -
3.0990 13800 0.0084 -
3.1103 13850 0.002 -
3.1215 13900 0.0011 -
3.1327 13950 0.0037 -
3.1439 14000 0.0087 -
3.1552 14050 0.0014 -
3.1664 14100 0.0029 -
3.1776 14150 0.0176 -
3.1889 14200 0.0028 -
3.2001 14250 0.012 -
3.2113 14300 0.0933 -
3.2225 14350 0.002 -
3.2338 14400 0.053 -
3.2450 14450 0.0117 -
3.2562 14500 0.0227 -
3.2675 14550 0.0055 -
3.2787 14600 0.008 -
3.2899 14650 0.0512 -
3.3011 14700 0.0025 -
3.3124 14750 0.0432 -
3.3236 14800 0.002 -
3.3348 14850 0.013 -
3.3461 14900 0.0026 -
3.3573 14950 0.0022 -
3.3685 15000 0.0225 -
3.3797 15050 0.0611 -
3.3910 15100 0.0261 -
3.4022 15150 0.0026 -
3.4134 15200 0.004 -
3.4247 15250 0.0054 -
3.4359 15300 0.0132 -
3.4471 15350 0.0017 -
3.4583 15400 0.0213 -
3.4696 15450 0.007 -
3.4808 15500 0.0507 -
3.4920 15550 0.0039 -
3.5033 15600 0.0059 -
3.5145 15650 0.0357 -
3.5257 15700 0.0009 -
3.5369 15750 0.0014 -
3.5482 15800 0.0011 -
3.5594 15850 0.0082 -
3.5706 15900 0.001 -
3.5819 15950 0.0045 -
3.5931 16000 0.0205 -
3.6043 16050 0.0096 -
3.6155 16100 0.0286 -
3.6268 16150 0.0043 -
3.6380 16200 0.0029 -
3.6492 16250 0.0079 -
3.6605 16300 0.0036 -
3.6717 16350 0.0013 -
3.6829 16400 0.0086 -
3.6941 16450 0.0049 -
3.7054 16500 0.0006 -
3.7166 16550 0.0467 -
3.7278 16600 0.002 -
3.7391 16650 0.0229 -
3.7503 16700 0.0532 -
3.7615 16750 0.001 -
3.7727 16800 0.0034 -
3.7840 16850 0.0117 -
3.7952 16900 0.0424 -
3.8064 16950 0.0032 -
3.8177 17000 0.0024 -
3.8289 17050 0.0011 -
3.8401 17100 0.0024 -
3.8513 17150 0.0059 -
3.8626 17200 0.0005 -
3.8738 17250 0.0074 -
3.8850 17300 0.0517 -
3.8962 17350 0.0081 -
3.9075 17400 0.0131 -
3.9187 17450 0.051 -
3.9299 17500 0.0114 -
3.9412 17550 0.0008 -
3.9524 17600 0.0094 -
3.9636 17650 0.001 -
3.9748 17700 0.0069 -
3.9861 17750 0.002 -
3.9973 17800 0.003 -
4.0 17812 - 0.2278
4.0085 17850 0.0309 -
4.0198 17900 0.005 -
4.0310 17950 0.0028 -
4.0422 18000 0.0069 -
4.0534 18050 0.002 -
4.0647 18100 0.0384 -
4.0759 18150 0.0123 -
4.0871 18200 0.0657 -
4.0984 18250 0.0042 -
4.1096 18300 0.0043 -
4.1208 18350 0.0035 -
4.1320 18400 0.0389 -
4.1433 18450 0.0303 -
4.1545 18500 0.002 -
4.1657 18550 0.0009 -
4.1770 18600 0.0025 -
4.1882 18650 0.1035 -
4.1994 18700 0.0033 -
4.2106 18750 0.0038 -
4.2219 18800 0.0161 -
4.2331 18850 0.0415 -
4.2443 18900 0.003 -
4.2556 18950 0.0055 -
4.2668 19000 0.0064 -
4.2780 19050 0.0656 -
4.2892 19100 0.0011 -
4.3005 19150 0.0252 -
4.3117 19200 0.0076 -
4.3229 19250 0.0051 -
4.3342 19300 0.0042 -
4.3454 19350 0.0043 -
4.3566 19400 0.014 -
4.3678 19450 0.0047 -
4.3791 19500 0.0043 -
4.3903 19550 0.0014 -
4.4015 19600 0.0017 -
4.4128 19650 0.0811 -
4.4240 19700 0.0013 -
4.4352 19750 0.0332 -
4.4464 19800 0.0636 -
4.4577 19850 0.0068 -
4.4689 19900 0.0076 -
4.4801 19950 0.0217 -
4.4914 20000 0.0387 -
4.5026 20050 0.0077 -
4.5138 20100 0.0778 -
4.5250 20150 0.0523 -
4.5363 20200 0.0597 -
4.5475 20250 0.0092 -
4.5587 20300 0.0684 -
4.5700 20350 0.0151 -
4.5812 20400 0.0007 -
4.5924 20450 0.0018 -
4.6036 20500 0.0003 -
4.6149 20550 0.0051 -
4.6261 20600 0.0144 -
4.6373 20650 0.011 -
4.6486 20700 0.0061 -
4.6598 20750 0.0066 -
4.6710 20800 0.0046 -
4.6822 20850 0.0511 -
4.6935 20900 0.0198 -
4.7047 20950 0.001 -
4.7159 21000 0.0022 -
4.7272 21050 0.053 -
4.7384 21100 0.0025 -
4.7496 21150 0.034 -
4.7608 21200 0.0147 -
4.7721 21250 0.0684 -
4.7833 21300 0.0012 -
4.7945 21350 0.0029 -
4.8057 21400 0.0014 -
4.8170 21450 0.0522 -
4.8282 21500 0.0766 -
4.8394 21550 0.0031 -
4.8507 21600 0.0012 -
4.8619 21650 0.0011 -
4.8731 21700 0.0235 -
4.8843 21750 0.001 -
4.8956 21800 0.0178 -
4.9068 21850 0.0006 -
4.9180 21900 0.0092 -
4.9293 21950 0.025 -
4.9405 22000 0.017 -
4.9517 22050 0.0052 -
4.9629 22100 0.0437 -
4.9742 22150 0.0019 -
4.9854 22200 0.0039 -
4.9966 22250 0.0015 -
5.0 22265 - 0.2357

Framework Versions

  • Python: 3.10.12
  • SetFit: 1.0.3
  • Sentence Transformers: 2.2.2
  • Transformers: 4.36.1
  • PyTorch: 2.0.1+cu118
  • Datasets: 2.15.0
  • Tokenizers: 0.15.0

Citation

BibTeX

@article{https://doi.org/10.48550/arxiv.2209.11055,
    doi = {10.48550/ARXIV.2209.11055},
    url = {https://arxiv.org/abs/2209.11055},
    author = {Tunstall, Lewis and Reimers, Nils and Jo, Unso Eun Seo and Bates, Luke and Korat, Daniel and Wasserblat, Moshe and Pereg, Oren},
    keywords = {Computation and Language (cs.CL), FOS: Computer and information sciences, FOS: Computer and information sciences},
    title = {Efficient Few-Shot Learning Without Prompts},
    publisher = {arXiv},
    year = {2022},
    copyright = {Creative Commons Attribution 4.0 International}
}