SetFit with thenlper/gte-large
This is a SetFit model trained on the PolyAI/banking77 dataset that can be used for text classification. It uses thenlper/gte-large as the Sentence Transformer embedding model and a LogisticRegression instance as the classification head.
The model has been trained using an efficient few-shot learning technique that involves:
- Fine-tuning a Sentence Transformer with contrastive learning.
- Training a classification head with features from the fine-tuned Sentence Transformer.
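The first phase fine-tunes the embedding model so that sentence pairs from the same class score high cosine similarity and pairs from different classes score low. The objective (CosineSimilarityLoss, listed under the hyperparameters below) can be illustrated with a minimal, framework-free sketch; the vectors and pair targets here are toy values, not real embeddings:

```python
from math import sqrt

def cosine(u, v):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = sqrt(sum(a * a for a in u))
    norm_v = sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

def cosine_similarity_loss(u, v, target):
    """Squared error between the pair's cosine similarity and its target:
    1.0 for a same-class pair, 0.0 for a different-class pair."""
    return (cosine(u, v) - target) ** 2

# A same-class pair whose embeddings already align perfectly: zero loss.
same_pair_loss = cosine_similarity_loss([1.0, 0.0], [2.0, 0.0], 1.0)
# A same-class pair with orthogonal embeddings: maximal loss, so the
# fine-tuning step would pull these two embeddings together.
misaligned_loss = cosine_similarity_loss([1.0, 0.0], [0.0, 1.0], 1.0)
```

In the actual training this loss is backpropagated through the Sentence Transformer body; the sketch only shows what quantity is being minimized.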
Model Details
Model Description
- Model Type: SetFit
- Sentence Transformer body: thenlper/gte-large
- Classification head: a LogisticRegression instance
- Maximum Sequence Length: 512 tokens
- Number of Classes: 77 classes
- Training Dataset: PolyAI/banking77
Model Sources
- Repository: SetFit on GitHub
- Paper: Efficient Few-Shot Learning Without Prompts
- Blogpost: SetFit: Efficient Few-Shot Learning Without Prompts
Model Labels
The per-label example table (the 77 integer label ids 0–76 of the banking77 intents, each with sample utterances) did not survive extraction; the example sentences are omitted here.
Evaluation
Metrics
Label | Accuracy |
---|---|
all | 0.9286 |
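To put the reported accuracy in absolute terms, a quick back-of-the-envelope calculation, assuming the standard banking77 test split of 3,080 examples (an assumption; the card does not state the evaluation set size):

```python
# Reported overall accuracy on banking77 (from the table above).
accuracy = 0.9286
n_test = 3_080                        # assumed banking77 test split size
n_correct = round(accuracy * n_test)  # examples classified correctly
n_errors = n_test - n_correct         # misclassifications across 77 classes
```

Roughly 220 misclassifications across 77 fine-grained banking intents.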
Uses
Direct Use for Inference
First install the SetFit library:

```shell
pip install setfit
```

Then you can load this model and run inference:

```python
from setfit import SetFitModel

# Download from the 🤗 Hub
model = SetFitModel.from_pretrained("HarshalBhg/gte-large-setfit-train-b77-test3")
# Run inference
preds = model("I made a transfer and am still waiting.")
```
Training Details
Training Set Metrics
Training set | Min | Median | Max |
---|---|---|---|
Word count | 2 | 12.0100 | 83 |
Label | Training Sample Count |
---|---|
0 | 159 |
1 | 110 |
2 | 126 |
3 | 87 |
4 | 127 |
5 | 171 |
6 | 181 |
7 | 156 |
8 | 157 |
9 | 129 |
10 | 59 |
11 | 153 |
12 | 112 |
13 | 139 |
14 | 112 |
15 | 187 |
16 | 168 |
17 | 167 |
18 | 61 |
19 | 177 |
20 | 160 |
21 | 122 |
22 | 86 |
23 | 35 |
24 | 129 |
25 | 153 |
26 | 173 |
27 | 133 |
28 | 182 |
29 | 121 |
30 | 121 |
31 | 121 |
32 | 112 |
33 | 118 |
34 | 166 |
35 | 137 |
36 | 126 |
37 | 97 |
38 | 106 |
39 | 129 |
40 | 98 |
41 | 82 |
42 | 121 |
43 | 120 |
44 | 105 |
45 | 159 |
46 | 143 |
47 | 149 |
48 | 148 |
49 | 115 |
50 | 95 |
51 | 162 |
52 | 169 |
53 | 161 |
54 | 129 |
55 | 108 |
56 | 111 |
57 | 114 |
58 | 114 |
59 | 145 |
60 | 97 |
61 | 146 |
62 | 103 |
63 | 175 |
64 | 172 |
65 | 113 |
66 | 171 |
67 | 128 |
68 | 102 |
69 | 104 |
70 | 113 |
71 | 126 |
72 | 41 |
73 | 135 |
74 | 121 |
75 | 180 |
76 | 163 |
Training Hyperparameters
- batch_size: (16, 16)
- num_epochs: (1, 1)
- max_steps: -1
- sampling_strategy: oversampling
- num_iterations: 20
- body_learning_rate: (2e-05, 1e-05)
- head_learning_rate: 0.01
- loss: CosineSimilarityLoss
- distance_metric: cosine_distance
- margin: 0.25
- end_to_end: False
- use_amp: False
- warmup_proportion: 0.1
- seed: 42
- eval_max_steps: -1
- load_best_model_at_end: False
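These hyperparameters also explain the length of the training log below. In SetFit, `num_iterations` controls how many positive and how many negative contrastive pairs are drawn per training sentence. A rough consistency check, assuming a banking77 train split of about 10,000 sentences (an assumption; the per-label counts in the table above sum to roughly that figure):

```python
num_iterations = 20   # pairs drawn per sentence, per polarity (from above)
batch_size = 16       # contrastive batch size (from above)
n_train = 10_000      # approximate training-set size; an assumption

# Each sentence yields num_iterations positive and num_iterations negative
# pairs, so one epoch sees 2 * num_iterations * n_train pairs in total.
total_pairs = 2 * num_iterations * n_train
steps_per_epoch = total_pairs // batch_size
```

This yields 25,000 steps per epoch, matching the training log, which ends at step 25,000 just before epoch 1.0.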
Training Results
Epoch | Step | Training Loss | Validation Loss |
---|---|---|---|
0.0000 | 1 | 0.3308 | - |
0.0020 | 50 | 0.3355 | - |
0.0040 | 100 | 0.2973 | - |
0.0060 | 150 | 0.2882 | - |
0.0080 | 200 | 0.2114 | - |
0.0100 | 250 | 0.1366 | - |
0.0120 | 300 | 0.1641 | - |
0.0140 | 350 | 0.0941 | - |
0.0160 | 400 | 0.0899 | - |
0.0180 | 450 | 0.0607 | - |
0.0200 | 500 | 0.0672 | - |
0.0220 | 550 | 0.0454 | - |
0.0240 | 600 | 0.0983 | - |
0.0260 | 650 | 0.0877 | - |
0.0280 | 700 | 0.0937 | - |
0.0300 | 750 | 0.0625 | - |
0.0320 | 800 | 0.1028 | - |
0.0340 | 850 | 0.0244 | - |
0.0360 | 900 | 0.042 | - |
0.0380 | 950 | 0.0804 | - |
0.0400 | 1000 | 0.0163 | - |
0.0420 | 1050 | 0.0203 | - |
0.0440 | 1100 | 0.0835 | - |
0.0460 | 1150 | 0.0467 | - |
0.0480 | 1200 | 0.046 | - |
0.0500 | 1250 | 0.0275 | - |
0.0520 | 1300 | 0.0179 | - |
0.0540 | 1350 | 0.022 | - |
0.0560 | 1400 | 0.0617 | - |
0.0580 | 1450 | 0.0167 | - |
0.0600 | 1500 | 0.0182 | - |
0.0620 | 1550 | 0.0768 | - |
0.0640 | 1600 | 0.0807 | - |
0.0660 | 1650 | 0.0248 | - |
0.0680 | 1700 | 0.0283 | - |
0.0700 | 1750 | 0.0835 | - |
0.0720 | 1800 | 0.0073 | - |
0.0740 | 1850 | 0.0043 | - |
0.0760 | 1900 | 0.0569 | - |
0.0780 | 1950 | 0.062 | - |
0.0800 | 2000 | 0.0228 | - |
0.0820 | 2050 | 0.0493 | - |
0.0840 | 2100 | 0.0139 | - |
0.0860 | 2150 | 0.0524 | - |
0.0880 | 2200 | 0.0054 | - |
0.0900 | 2250 | 0.045 | - |
0.0920 | 2300 | 0.0304 | - |
0.0940 | 2350 | 0.0688 | - |
0.0960 | 2400 | 0.0372 | - |
0.0980 | 2450 | 0.0111 | - |
0.1000 | 2500 | 0.0068 | - |
0.1020 | 2550 | 0.0087 | - |
0.1040 | 2600 | 0.0032 | - |
0.1060 | 2650 | 0.0416 | - |
0.1080 | 2700 | 0.0172 | - |
0.1100 | 2750 | 0.0931 | - |
0.1120 | 2800 | 0.038 | - |
0.1140 | 2850 | 0.0342 | - |
0.1160 | 2900 | 0.0089 | - |
0.1180 | 2950 | 0.0064 | - |
0.1200 | 3000 | 0.0105 | - |
0.1220 | 3050 | 0.0026 | - |
0.1240 | 3100 | 0.0059 | - |
0.1260 | 3150 | 0.0028 | - |
0.1280 | 3200 | 0.0255 | - |
0.1300 | 3250 | 0.031 | - |
0.1320 | 3300 | 0.0017 | - |
0.1340 | 3350 | 0.0018 | - |
0.1360 | 3400 | 0.0033 | - |
0.1380 | 3450 | 0.0416 | - |
0.1400 | 3500 | 0.0283 | - |
0.1420 | 3550 | 0.0024 | - |
0.1440 | 3600 | 0.0029 | - |
0.1460 | 3650 | 0.0483 | - |
0.1480 | 3700 | 0.0057 | - |
0.1500 | 3750 | 0.0054 | - |
0.1520 | 3800 | 0.0254 | - |
0.1540 | 3850 | 0.0142 | - |
0.1560 | 3900 | 0.0448 | - |
0.1579 | 3950 | 0.0499 | - |
0.1599 | 4000 | 0.0021 | - |
0.1619 | 4050 | 0.0302 | - |
0.1639 | 4100 | 0.0115 | - |
0.1659 | 4150 | 0.0934 | - |
0.1679 | 4200 | 0.0083 | - |
0.1699 | 4250 | 0.002 | - |
0.1719 | 4300 | 0.0009 | - |
0.1739 | 4350 | 0.0015 | - |
0.1759 | 4400 | 0.007 | - |
0.1779 | 4450 | 0.0255 | - |
0.1799 | 4500 | 0.0057 | - |
0.1819 | 4550 | 0.0154 | - |
0.1839 | 4600 | 0.0018 | - |
0.1859 | 4650 | 0.0233 | - |
0.1879 | 4700 | 0.0368 | - |
0.1899 | 4750 | 0.001 | - |
0.1919 | 4800 | 0.0102 | - |
0.1939 | 4850 | 0.0051 | - |
0.1959 | 4900 | 0.0007 | - |
0.1979 | 4950 | 0.0176 | - |
0.1999 | 5000 | 0.0622 | - |
0.2019 | 5050 | 0.0161 | - |
0.2039 | 5100 | 0.0352 | - |
0.2059 | 5150 | 0.0614 | - |
0.2079 | 5200 | 0.0035 | - |
0.2099 | 5250 | 0.0045 | - |
0.2119 | 5300 | 0.0128 | - |
0.2139 | 5350 | 0.0012 | - |
0.2159 | 5400 | 0.0063 | - |
0.2179 | 5450 | 0.0602 | - |
0.2199 | 5500 | 0.0336 | - |
0.2219 | 5550 | 0.0018 | - |
0.2239 | 5600 | 0.0007 | - |
0.2259 | 5650 | 0.0142 | - |
0.2279 | 5700 | 0.001 | - |
0.2299 | 5750 | 0.0008 | - |
0.2319 | 5800 | 0.0018 | - |
0.2339 | 5850 | 0.0506 | - |
0.2359 | 5900 | 0.0026 | - |
0.2379 | 5950 | 0.0005 | - |
0.2399 | 6000 | 0.0014 | - |
0.2419 | 6050 | 0.0054 | - |
0.2439 | 6100 | 0.0297 | - |
0.2459 | 6150 | 0.0067 | - |
0.2479 | 6200 | 0.0331 | - |
0.2499 | 6250 | 0.0003 | - |
0.2519 | 6300 | 0.0068 | - |
0.2539 | 6350 | 0.0044 | - |
0.2559 | 6400 | 0.0124 | - |
0.2579 | 6450 | 0.0023 | - |
0.2599 | 6500 | 0.0007 | - |
0.2619 | 6550 | 0.0209 | - |
0.2639 | 6600 | 0.0009 | - |
0.2659 | 6650 | 0.0006 | - |
0.2679 | 6700 | 0.0018 | - |
0.2699 | 6750 | 0.0086 | - |
0.2719 | 6800 | 0.0005 | - |
0.2739 | 6850 | 0.0012 | - |
0.2759 | 6900 | 0.0081 | - |
0.2779 | 6950 | 0.0008 | - |
0.2799 | 7000 | 0.0013 | - |
0.2819 | 7050 | 0.0024 | - |
0.2839 | 7100 | 0.0024 | - |
0.2859 | 7150 | 0.0049 | - |
0.2879 | 7200 | 0.003 | - |
0.2899 | 7250 | 0.0015 | - |
0.2919 | 7300 | 0.0006 | - |
0.2939 | 7350 | 0.0568 | - |
0.2959 | 7400 | 0.0014 | - |
0.2979 | 7450 | 0.0017 | - |
0.2999 | 7500 | 0.0005 | - |
0.3019 | 7550 | 0.0056 | - |
0.3039 | 7600 | 0.0014 | - |
0.3059 | 7650 | 0.0013 | - |
0.3079 | 7700 | 0.0027 | - |
0.3099 | 7750 | 0.0027 | - |
0.3119 | 7800 | 0.0017 | - |
0.3139 | 7850 | 0.0308 | - |
0.3159 | 7900 | 0.0007 | - |
0.3179 | 7950 | 0.0026 | - |
0.3199 | 8000 | 0.0025 | - |
0.3219 | 8050 | 0.0005 | - |
0.3239 | 8100 | 0.0005 | - |
0.3259 | 8150 | 0.001 | - |
0.3279 | 8200 | 0.0049 | - |
0.3299 | 8250 | 0.0008 | - |
0.3319 | 8300 | 0.0019 | - |
0.3339 | 8350 | 0.0005 | - |
0.3359 | 8400 | 0.0022 | - |
0.3379 | 8450 | 0.001 | - |
0.3399 | 8500 | 0.0227 | - |
0.3419 | 8550 | 0.0006 | - |
0.3439 | 8600 | 0.0004 | - |
0.3459 | 8650 | 0.0002 | - |
0.3479 | 8700 | 0.0005 | - |
0.3499 | 8750 | 0.0009 | - |
0.3519 | 8800 | 0.001 | - |
0.3539 | 8850 | 0.0011 | - |
0.3559 | 8900 | 0.0011 | - |
0.3579 | 8950 | 0.0002 | - |
0.3599 | 9000 | 0.0845 | - |
0.3619 | 9050 | 0.002 | - |
0.3639 | 9100 | 0.003 | - |
0.3659 | 9150 | 0.0224 | - |
0.3679 | 9200 | 0.0023 | - |
0.3699 | 9250 | 0.0014 | - |
0.3719 | 9300 | 0.0018 | - |
0.3739 | 9350 | 0.0006 | - |
0.3759 | 9400 | 0.0015 | - |
0.3779 | 9450 | 0.0008 | - |
0.3799 | 9500 | 0.0019 | - |
0.3819 | 9550 | 0.0005 | - |
0.3839 | 9600 | 0.0474 | - |
0.3859 | 9650 | 0.0042 | - |
0.3879 | 9700 | 0.0032 | - |
0.3899 | 9750 | 0.0279 | - |
0.3919 | 9800 | 0.0011 | - |
0.3939 | 9850 | 0.003 | - |
0.3959 | 9900 | 0.0007 | - |
0.3979 | 9950 | 0.0016 | - |
0.3999 | 10000 | 0.0006 | - |
0.4019 | 10050 | 0.0011 | - |
0.4039 | 10100 | 0.0332 | - |
0.4059 | 10150 | 0.0006 | - |
0.4079 | 10200 | 0.0005 | - |
0.4099 | 10250 | 0.0009 | - |
0.4119 | 10300 | 0.0004 | - |
0.4139 | 10350 | 0.0006 | - |
0.4159 | 10400 | 0.0033 | - |
0.4179 | 10450 | 0.0011 | - |
0.4199 | 10500 | 0.0013 | - |
0.4219 | 10550 | 0.0004 | - |
0.4239 | 10600 | 0.0057 | - |
0.4259 | 10650 | 0.0038 | - |
0.4279 | 10700 | 0.0009 | - |
0.4299 | 10750 | 0.0018 | - |
0.4319 | 10800 | 0.0354 | - |
0.4339 | 10850 | 0.0007 | - |
0.4359 | 10900 | 0.0275 | - |
0.4379 | 10950 | 0.0007 | - |
0.4399 | 11000 | 0.0608 | - |
0.4419 | 11050 | 0.0008 | - |
0.4439 | 11100 | 0.0012 | - |
0.4459 | 11150 | 0.001 | - |
0.4479 | 11200 | 0.0029 | - |
0.4499 | 11250 | 0.0005 | - |
0.4519 | 11300 | 0.0003 | - |
0.4539 | 11350 | 0.0009 | - |
0.4559 | 11400 | 0.0002 | - |
0.4579 | 11450 | 0.0024 | - |
0.4599 | 11500 | 0.0022 | - |
0.4619 | 11550 | 0.0006 | - |
0.4639 | 11600 | 0.0018 | - |
0.4659 | 11650 | 0.0534 | - |
0.4679 | 11700 | 0.0005 | - |
0.4698 | 11750 | 0.0004 | - |
0.4718 | 11800 | 0.047 | - |
0.4738 | 11850 | 0.0021 | - |
0.4758 | 11900 | 0.0004 | - |
0.4778 | 11950 | 0.0006 | - |
0.4798 | 12000 | 0.0003 | - |
0.4818 | 12050 | 0.0049 | - |
0.4838 | 12100 | 0.0005 | - |
0.4858 | 12150 | 0.0003 | - |
0.4878 | 12200 | 0.0025 | - |
0.4898 | 12250 | 0.0011 | - |
0.4918 | 12300 | 0.0005 | - |
0.4938 | 12350 | 0.0064 | - |
0.4958 | 12400 | 0.0062 | - |
0.4978 | 12450 | 0.0046 | - |
0.4998 | 12500 | 0.0005 | - |
0.5018 | 12550 | 0.0003 | - |
0.5038 | 12600 | 0.0527 | - |
0.5058 | 12650 | 0.0013 | - |
0.5078 | 12700 | 0.0008 | - |
0.5098 | 12750 | 0.0003 | - |
0.5118 | 12800 | 0.0003 | - |
0.5138 | 12850 | 0.0004 | - |
0.5158 | 12900 | 0.0562 | - |
0.5178 | 12950 | 0.0003 | - |
0.5198 | 13000 | 0.0006 | - |
0.5218 | 13050 | 0.0009 | - |
0.5238 | 13100 | 0.0038 | - |
0.5258 | 13150 | 0.0006 | - |
0.5278 | 13200 | 0.0222 | - |
0.5298 | 13250 | 0.0003 | - |
0.5318 | 13300 | 0.0005 | - |
0.5338 | 13350 | 0.0003 | - |
0.5358 | 13400 | 0.0006 | - |
0.5378 | 13450 | 0.0003 | - |
0.5398 | 13500 | 0.0534 | - |
0.5418 | 13550 | 0.0005 | - |
0.5438 | 13600 | 0.001 | - |
0.5458 | 13650 | 0.0004 | - |
0.5478 | 13700 | 0.0008 | - |
0.5498 | 13750 | 0.0034 | - |
0.5518 | 13800 | 0.0018 | - |
0.5538 | 13850 | 0.0077 | - |
0.5558 | 13900 | 0.0003 | - |
0.5578 | 13950 | 0.0005 | - |
0.5598 | 14000 | 0.0012 | - |
0.5618 | 14050 | 0.0557 | - |
0.5638 | 14100 | 0.0015 | - |
0.5658 | 14150 | 0.0006 | - |
0.5678 | 14200 | 0.0005 | - |
0.5698 | 14250 | 0.0016 | - |
0.5718 | 14300 | 0.0007 | - |
0.5738 | 14350 | 0.0005 | - |
0.5758 | 14400 | 0.0006 | - |
0.5778 | 14450 | 0.0004 | - |
0.5798 | 14500 | 0.0021 | - |
0.5818 | 14550 | 0.0029 | - |
0.5838 | 14600 | 0.0025 | - |
0.5858 | 14650 | 0.0002 | - |
0.5878 | 14700 | 0.0164 | - |
0.5898 | 14750 | 0.0005 | - |
0.5918 | 14800 | 0.0026 | - |
0.5938 | 14850 | 0.0005 | - |
0.5958 | 14900 | 0.0003 | - |
0.5978 | 14950 | 0.0003 | - |
0.5998 | 15000 | 0.0003 | - |
0.6018 | 15050 | 0.0472 | - |
0.6038 | 15100 | 0.0004 | - |
0.6058 | 15150 | 0.0001 | - |
0.6078 | 15200 | 0.0005 | - |
0.6098 | 15250 | 0.0081 | - |
0.6118 | 15300 | 0.0561 | - |
0.6138 | 15350 | 0.0007 | - |
0.6158 | 15400 | 0.0028 | - |
0.6178 | 15450 | 0.0003 | - |
0.6198 | 15500 | 0.0006 | - |
0.6218 | 15550 | 0.0005 | - |
0.6238 | 15600 | 0.0003 | - |
0.6258 | 15650 | 0.0005 | - |
0.6278 | 15700 | 0.062 | - |
0.6298 | 15750 | 0.0002 | - |
0.6318 | 15800 | 0.0564 | - |
0.6338 | 15850 | 0.0576 | - |
0.6358 | 15900 | 0.0013 | - |
0.6378 | 15950 | 0.0026 | - |
0.6398 | 16000 | 0.0003 | - |
0.6418 | 16050 | 0.0013 | - |
0.6438 | 16100 | 0.0058 | - |
0.6458 | 16150 | 0.0554 | - |
0.6478 | 16200 | 0.0045 | - |
0.6498 | 16250 | 0.0011 | - |
0.6518 | 16300 | 0.0002 | - |
0.6538 | 16350 | 0.0063 | - |
0.6558 | 16400 | 0.0002 | - |
0.6578 | 16450 | 0.0006 | - |
0.6598 | 16500 | 0.0003 | - |
0.6618 | 16550 | 0.0003 | - |
0.6638 | 16600 | 0.0012 | - |
0.6658 | 16650 | 0.0003 | - |
0.6678 | 16700 | 0.0015 | - |
0.6698 | 16750 | 0.0004 | - |
0.6718 | 16800 | 0.0004 | - |
0.6738 | 16850 | 0.0005 | - |
0.6758 | 16900 | 0.0002 | - |
0.6778 | 16950 | 0.0004 | - |
0.6798 | 17000 | 0.0114 | - |
0.6818 | 17050 | 0.0004 | - |
0.6838 | 17100 | 0.0003 | - |
0.6858 | 17150 | 0.0007 | - |
0.6878 | 17200 | 0.0005 | - |
0.6898 | 17250 | 0.0022 | - |
0.6918 | 17300 | 0.0002 | - |
0.6938 | 17350 | 0.0002 | - |
0.6958 | 17400 | 0.0576 | - |
0.6978 | 17450 | 0.0002 | - |
0.6998 | 17500 | 0.0003 | - |
0.7018 | 17550 | 0.0005 | - |
0.7038 | 17600 | 0.0007 | - |
0.7058 | 17650 | 0.0002 | - |
0.7078 | 17700 | 0.0006 | - |
0.7098 | 17750 | 0.0003 | - |
0.7118 | 17800 | 0.0003 | - |
0.7138 | 17850 | 0.0002 | - |
0.7158 | 17900 | 0.0002 | - |
0.7178 | 17950 | 0.0263 | - |
0.7198 | 18000 | 0.0003 | - |
0.7218 | 18050 | 0.0003 | - |
0.7238 | 18100 | 0.0002 | - |
0.7258 | 18150 | 0.0007 | - |
0.7278 | 18200 | 0.0009 | - |
0.7298 | 18250 | 0.0002 | - |
0.7318 | 18300 | 0.0002 | - |
0.7338 | 18350 | 0.0004 | - |
0.7358 | 18400 | 0.0003 | - |
0.7378 | 18450 | 0.0002 | - |
0.7398 | 18500 | 0.0005 | - |
0.7418 | 18550 | 0.0002 | - |
0.7438 | 18600 | 0.0011 | - |
0.7458 | 18650 | 0.0005 | - |
0.7478 | 18700 | 0.0196 | - |
0.7498 | 18750 | 0.0003 | - |
0.7518 | 18800 | 0.0382 | - |
0.7538 | 18850 | 0.003 | - |
0.7558 | 18900 | 0.0003 | - |
0.7578 | 18950 | 0.0002 | - |
0.7598 | 19000 | 0.0002 | - |
0.7618 | 19050 | 0.0009 | - |
0.7638 | 19100 | 0.0002 | - |
0.7658 | 19150 | 0.0002 | - |
0.7678 | 19200 | 0.0002 | - |
0.7698 | 19250 | 0.0002 | - |
0.7718 | 19300 | 0.0002 | - |
0.7738 | 19350 | 0.0006 | - |
0.7758 | 19400 | 0.0003 | - |
0.7778 | 19450 | 0.0004 | - |
0.7798 | 19500 | 0.0002 | - |
0.7817 | 19550 | 0.0008 | - |
0.7837 | 19600 | 0.0002 | - |
0.7857 | 19650 | 0.0004 | - |
0.7877 | 19700 | 0.0002 | - |
0.7897 | 19750 | 0.0001 | - |
0.7917 | 19800 | 0.0027 | - |
0.7937 | 19850 | 0.0002 | - |
0.7957 | 19900 | 0.0006 | - |
0.7977 | 19950 | 0.0002 | - |
0.7997 | 20000 | 0.0003 | - |
0.8017 | 20050 | 0.1162 | - |
0.8037 | 20100 | 0.0056 | - |
0.8057 | 20150 | 0.0004 | - |
0.8077 | 20200 | 0.0021 | - |
0.8097 | 20250 | 0.0002 | - |
0.8117 | 20300 | 0.0002 | - |
0.8137 | 20350 | 0.0004 | - |
0.8157 | 20400 | 0.001 | - |
0.8177 | 20450 | 0.0005 | - |
0.8197 | 20500 | 0.0061 | - |
0.8217 | 20550 | 0.0002 | - |
0.8237 | 20600 | 0.0013 | - |
0.8257 | 20650 | 0.0007 | - |
0.8277 | 20700 | 0.0001 | - |
0.8297 | 20750 | 0.0006 | - |
0.8317 | 20800 | 0.0007 | - |
0.8337 | 20850 | 0.0007 | - |
0.8357 | 20900 | 0.0019 | - |
0.8377 | 20950 | 0.0001 | - |
0.8397 | 21000 | 0.0003 | - |
0.8417 | 21050 | 0.0002 | - |
0.8437 | 21100 | 0.0005 | - |
0.8457 | 21150 | 0.0001 | - |
0.8477 | 21200 | 0.0006 | - |
0.8497 | 21250 | 0.0002 | - |
0.8517 | 21300 | 0.0006 | - |
0.8537 | 21350 | 0.0008 | - |
0.8557 | 21400 | 0.0007 | - |
0.8577 | 21450 | 0.0007 | - |
0.8597 | 21500 | 0.0002 | - |
0.8617 | 21550 | 0.0002 | - |
0.8637 | 21600 | 0.0003 | - |
0.8657 | 21650 | 0.0003 | - |
0.8677 | 21700 | 0.0002 | - |
0.8697 | 21750 | 0.0002 | - |
0.8717 | 21800 | 0.0002 | - |
0.8737 | 21850 | 0.0015 | - |
0.8757 | 21900 | 0.0003 | - |
0.8777 | 21950 | 0.0013 | - |
0.8797 | 22000 | 0.0002 | - |
0.8817 | 22050 | 0.0004 | - |
0.8837 | 22100 | 0.0002 | - |
0.8857 | 22150 | 0.0097 | - |
0.8877 | 22200 | 0.0002 | - |
0.8897 | 22250 | 0.0003 | - |
0.8917 | 22300 | 0.0002 | - |
0.8937 | 22350 | 0.0002 | - |
0.8957 | 22400 | 0.0002 | - |
0.8977 | 22450 | 0.0002 | - |
0.8997 | 22500 | 0.0002 | - |
0.9017 | 22550 | 0.0002 | - |
0.9037 | 22600 | 0.0011 | - |
0.9057 | 22650 | 0.0003 | - |
0.9077 | 22700 | 0.0003 | - |
0.9097 | 22750 | 0.0004 | - |
0.9117 | 22800 | 0.0002 | - |
0.9137 | 22850 | 0.0001 | - |
0.9157 | 22900 | 0.0006 | - |
0.9177 | 22950 | 0.0002 | - |
0.9197 | 23000 | 0.0004 | - |
0.9217 | 23050 | 0.0002 | - |
0.9237 | 23100 | 0.0002 | - |
0.9257 | 23150 | 0.0003 | - |
0.9277 | 23200 | 0.0034 | - |
0.9297 | 23250 | 0.0003 | - |
0.9317 | 23300 | 0.0486 | - |
0.9337 | 23350 | 0.0015 | - |
0.9357 | 23400 | 0.0007 | - |
0.9377 | 23450 | 0.0002 | - |
0.9397 | 23500 | 0.0004 | - |
0.9417 | 23550 | 0.0003 | - |
0.9437 | 23600 | 0.0597 | - |
0.9457 | 23650 | 0.0559 | - |
0.9477 | 23700 | 0.0003 | - |
0.9497 | 23750 | 0.0009 | - |
0.9517 | 23800 | 0.0008 | - |
0.9537 | 23850 | 0.0001 | - |
0.9557 | 23900 | 0.0006 | - |
0.9577 | 23950 | 0.0002 | - |
0.9597 | 24000 | 0.0001 | - |
0.9617 | 24050 | 0.0003 | - |
0.9637 | 24100 | 0.0001 | - |
0.9657 | 24150 | 0.0002 | - |
0.9677 | 24200 | 0.0002 | - |
0.9697 | 24250 | 0.0003 | - |
0.9717 | 24300 | 0.0001 | - |
0.9737 | 24350 | 0.0001 | - |
0.9757 | 24400 | 0.0003 | - |
0.9777 | 24450 | 0.0005 | - |
0.9797 | 24500 | 0.0003 | - |
0.9817 | 24550 | 0.0026 | - |
0.9837 | 24600 | 0.0003 | - |
0.9857 | 24650 | 0.0001 | - |
0.9877 | 24700 | 0.0003 | - |
0.9897 | 24750 | 0.0003 | - |
0.9917 | 24800 | 0.0001 | - |
0.9937 | 24850 | 0.0002 | - |
0.9957 | 24900 | 0.0002 | - |
0.9977 | 24950 | 0.002 | - |
0.9997 | 25000 | 0.0002 | - |
Framework Versions
- Python: 3.10.12
- SetFit: 1.0.1
- Sentence Transformers: 2.2.2
- Transformers: 4.35.2
- PyTorch: 2.1.0+cu121
- Datasets: 2.15.0
- Tokenizers: 0.15.0
Citation
BibTeX
@article{https://doi.org/10.48550/arxiv.2209.11055,
doi = {10.48550/ARXIV.2209.11055},
url = {https://arxiv.org/abs/2209.11055},
author = {Tunstall, Lewis and Reimers, Nils and Jo, Unso Eun Seo and Bates, Luke and Korat, Daniel and Wasserblat, Moshe and Pereg, Oren},
keywords = {Computation and Language (cs.CL), FOS: Computer and information sciences},
title = {Efficient Few-Shot Learning Without Prompts},
publisher = {arXiv},
year = {2022},
copyright = {Creative Commons Attribution 4.0 International}
}
Model tree for HarshalBhg/gte-large-setfit-train-b77-test3
- Base model: thenlper/gte-large
- Training dataset: PolyAI/banking77
Evaluation results
- Accuracy on the PolyAI/banking77 test set (self-reported): 0.929