---
library_name: setfit
tags:
  - setfit
  - sentence-transformers
  - text-classification
  - generated_from_setfit_trainer
base_model: sentence-transformers/paraphrase-multilingual-mpnet-base-v2
metrics:
  - accuracy
widget:
  - text: 'loan repayment '
  - text: 2023-F48
  - text: 'acompte '
  - text: 2023-12-1165548
  - text: Facture 20230040
pipeline_tag: text-classification
inference: true
model-index:
  - name: SetFit with sentence-transformers/paraphrase-multilingual-mpnet-base-v2
    results:
      - task:
          type: text-classification
          name: Text Classification
        dataset:
          name: Unknown
          type: unknown
          split: test
        metrics:
          - type: accuracy
            value: 0.73568281938326
            name: Accuracy
---

# SetFit with sentence-transformers/paraphrase-multilingual-mpnet-base-v2

This is a [SetFit](https://github.com/huggingface/setfit) model that can be used for text classification. It uses [sentence-transformers/paraphrase-multilingual-mpnet-base-v2](https://huggingface.co/sentence-transformers/paraphrase-multilingual-mpnet-base-v2) as the Sentence Transformer embedding model and a LogisticRegression instance as the classification head.

The model has been trained using an efficient few-shot learning technique, sketched in code below, that involves:

  1. Fine-tuning a Sentence Transformer with contrastive learning.
  2. Training a classification head with features from the fine-tuned Sentence Transformer.
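
As a rough illustration of these two steps, here is a minimal training sketch against the SetFit 1.x `Trainer` API; the tiny dataset and its label assignments are hypothetical, not this model's actual training data:

```python
from datasets import Dataset
from setfit import SetFitModel, Trainer, TrainingArguments

# Hypothetical few-shot examples; the label assignments are illustrative only.
train_dataset = Dataset.from_dict({
    "text": ["Facture 20230040", "loan repayment", "2023-12-1165548", "acompte"],
    "label": ["invoice", "refund", "random characters", "buying"],
})

# Start from the same multilingual embedding body used by this model.
model = SetFitModel.from_pretrained(
    "sentence-transformers/paraphrase-multilingual-mpnet-base-v2",
    labels=["buying", "company name", "invoice", "random characters",
            "refund", "rent", "salary"],
)

args = TrainingArguments(batch_size=16, num_epochs=1)
trainer = Trainer(model=model, args=args, train_dataset=train_dataset)

# Step 1 (contrastive fine-tuning of the body) and step 2 (fitting the
# LogisticRegression head) both happen inside train().
trainer.train()
```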

## Model Details

### Model Description

- **Model Type:** SetFit
- **Sentence Transformer body:** [sentence-transformers/paraphrase-multilingual-mpnet-base-v2](https://huggingface.co/sentence-transformers/paraphrase-multilingual-mpnet-base-v2)
- **Classification head:** a LogisticRegression instance
- **Number of Classes:** 7

### Model Sources

- **Repository:** [SetFit on GitHub](https://github.com/huggingface/setfit)
- **Paper:** [Efficient Few-Shot Learning Without Prompts](https://arxiv.org/abs/2209.11055)

## Evaluation

### Metrics

| Label | Accuracy |
|:------|:---------|
| all   | 0.7357   |
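
The accuracy above can be recomputed on any labeled test split with plain `model.predict`; a sketch (the two labeled examples are illustrative, substitute a real held-out split):

```python
from setfit import SetFitModel

model = SetFitModel.from_pretrained("luis-cardoso-q/kotodama-multilingual-v3")

# Illustrative labeled examples; replace with your own test data.
texts = ["Facture 20230040", "loan repayment"]
labels = ["invoice", "refund"]

preds = model.predict(texts)
accuracy = sum(str(p) == y for p, y in zip(preds, labels)) / len(labels)
print(f"accuracy: {accuracy:.4f}")
```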

## Uses

### Direct Use for Inference

First install the SetFit library:

```bash
pip install setfit
```

Then you can load this model and run inference.

```python
from setfit import SetFitModel

# Download from the 🤗 Hub
model = SetFitModel.from_pretrained("luis-cardoso-q/kotodama-multilingual-v3")
# Run inference
preds = model("2023-F48")
```

## Training Details

### Training Set Metrics

| Training set | Min | Median | Max |
|:-------------|:----|:-------|:----|
| Word count   | 1   | 2.6689 | 16  |

| Label             | Training Sample Count |
|:------------------|:----------------------|
| buying            | 25  |
| company name      | 73  |
| invoice           | 128 |
| random characters | 128 |
| refund            | 87  |
| rent              | 38  |
| salary            | 128 |

### Training Hyperparameters

- batch_size: (16, 16)
- num_epochs: (1, 1)
- max_steps: -1
- sampling_strategy: oversampling
- body_learning_rate: (2e-05, 1e-05)
- head_learning_rate: 0.01
- loss: CosineSimilarityLoss
- distance_metric: cosine_distance
- margin: 0.25
- end_to_end: False
- use_amp: False
- warmup_proportion: 0.1
- seed: 42
- eval_max_steps: -1
- load_best_model_at_end: True
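
These bullets correspond to fields of setfit's `TrainingArguments`; a hedged reconstruction, assuming the SetFit 1.x API (the `distance_metric` import reflects the library's cosine-distance default):

```python
from sentence_transformers.losses import CosineSimilarityLoss
from sentence_transformers.losses.BatchHardTripletLoss import (
    BatchHardTripletLossDistanceFunction,
)
from setfit import TrainingArguments

args = TrainingArguments(
    batch_size=(16, 16),              # (embedding phase, classifier phase)
    num_epochs=(1, 1),
    max_steps=-1,
    sampling_strategy="oversampling",
    body_learning_rate=(2e-05, 1e-05),
    head_learning_rate=0.01,
    loss=CosineSimilarityLoss,
    distance_metric=BatchHardTripletLossDistanceFunction.cosine_distance,
    margin=0.25,
    end_to_end=False,
    use_amp=False,
    warmup_proportion=0.1,
    seed=42,
    eval_max_steps=-1,
    load_best_model_at_end=True,
)
```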

### Training Results

| Epoch | Step | Training Loss | Validation Loss |
|:------:|:-----:|:-------------:|:---------------:|
| 0.0001 | 1 | 0.2604 | - |
| 0.0026 | 50 | 0.3244 | - |
| 0.0053 | 100 | 0.2233 | - |
| 0.0079 | 150 | 0.2034 | - |
| 0.0105 | 200 | 0.2998 | - |
| 0.0131 | 250 | 0.2074 | - |
| 0.0158 | 300 | 0.1682 | - |
| 0.0184 | 350 | 0.1815 | - |
| 0.0210 | 400 | 0.155 | - |
| 0.0237 | 450 | 0.16 | - |
| 0.0263 | 500 | 0.117 | - |
| 0.0289 | 550 | 0.1685 | - |
| 0.0315 | 600 | 0.0348 | - |
| 0.0342 | 650 | 0.0912 | - |
| 0.0368 | 700 | 0.0217 | - |
| 0.0394 | 750 | 0.0417 | - |
| 0.0421 | 800 | 0.0592 | - |
| 0.0447 | 850 | 0.047 | - |
| 0.0473 | 900 | 0.0914 | - |
| 0.0499 | 950 | 0.0116 | - |
| 0.0526 | 1000 | 0.022 | - |
| 0.0552 | 1050 | 0.0018 | - |
| 0.0578 | 1100 | 0.0159 | - |
| 0.0605 | 1150 | 0.0097 | - |
| 0.0631 | 1200 | 0.066 | - |
| 0.0657 | 1250 | 0.0027 | - |
| 0.0683 | 1300 | 0.003 | - |
| 0.0710 | 1350 | 0.0146 | - |
| 0.0736 | 1400 | 0.009 | - |
| 0.0762 | 1450 | 0.0016 | - |
| 0.0789 | 1500 | 0.001 | - |
| 0.0815 | 1550 | 0.019 | - |
| 0.0841 | 1600 | 0.0015 | - |
| 0.0867 | 1650 | 0.0003 | - |
| 0.0894 | 1700 | 0.0929 | - |
| 0.0920 | 1750 | 0.013 | - |
| 0.0946 | 1800 | 0.0007 | - |
| 0.0973 | 1850 | 0.0413 | - |
| 0.0999 | 1900 | 0.0922 | - |
| 0.1025 | 1950 | 0.0009 | - |
| 0.1051 | 2000 | 0.001 | - |
| 0.1078 | 2050 | 0.0007 | - |
| 0.1104 | 2100 | 0.0086 | - |
| 0.1130 | 2150 | 0.0017 | - |
| 0.1157 | 2200 | 0.0048 | - |
| 0.1183 | 2250 | 0.0002 | - |
| 0.1209 | 2300 | 0.0518 | - |
| 0.1235 | 2350 | 0.0271 | - |
| 0.1262 | 2400 | 0.0138 | - |
| 0.1288 | 2450 | 0.0136 | - |
| 0.1314 | 2500 | 0.0444 | - |
| 0.1341 | 2550 | 0.0096 | - |
| 0.1367 | 2600 | 0.0064 | - |
| 0.1393 | 2650 | 0.0092 | - |
| 0.1419 | 2700 | 0.0012 | - |
| 0.1446 | 2750 | 0.0044 | - |
| 0.1472 | 2800 | 0.0121 | - |
| 0.1498 | 2850 | 0.0004 | - |
| 0.1525 | 2900 | 0.0002 | - |
| 0.1551 | 2950 | 0.0008 | - |
| 0.1577 | 3000 | 0.0034 | - |
| 0.1603 | 3050 | 0.0002 | - |
| 0.1630 | 3100 | 0.0152 | - |
| 0.1656 | 3150 | 0.0195 | - |
| 0.1682 | 3200 | 0.0005 | - |
| 0.1709 | 3250 | 0.0002 | - |
| 0.1735 | 3300 | 0.0343 | - |
| 0.1761 | 3350 | 0.0095 | - |
| 0.1787 | 3400 | 0.0354 | - |
| 0.1814 | 3450 | 0.0085 | - |
| 0.1840 | 3500 | 0.001 | - |
| 0.1866 | 3550 | 0.0194 | - |
| 0.1893 | 3600 | 0.017 | - |
| 0.1919 | 3650 | 0.0003 | - |
| 0.1945 | 3700 | 0.0024 | - |
| 0.1972 | 3750 | 0.06 | - |
| 0.1998 | 3800 | 0.0006 | - |
| 0.2024 | 3850 | 0.0003 | - |
| 0.2050 | 3900 | 0.0311 | - |
| 0.2077 | 3950 | 0.023 | - |
| 0.2103 | 4000 | 0.0039 | - |
| 0.2129 | 4050 | 0.0085 | - |
| 0.2156 | 4100 | 0.0036 | - |
| 0.2182 | 4150 | 0.0015 | - |
| 0.2208 | 4200 | 0.0584 | - |
| 0.2234 | 4250 | 0.0004 | - |
| 0.2261 | 4300 | 0.0082 | - |
| 0.2287 | 4350 | 0.0001 | - |
| 0.2313 | 4400 | 0.0044 | - |
| 0.2340 | 4450 | 0.0003 | - |
| 0.2366 | 4500 | 0.0495 | - |
| 0.2392 | 4550 | 0.0073 | - |
| 0.2418 | 4600 | 0.0152 | - |
| 0.2445 | 4650 | 0.0033 | - |
| 0.2471 | 4700 | 0.0005 | - |
| 0.2497 | 4750 | 0.0102 | - |
| 0.2524 | 4800 | 0.046 | - |
| 0.2550 | 4850 | 0.0028 | - |
| 0.2576 | 4900 | 0.0014 | - |
| 0.2602 | 4950 | 0.0118 | - |
| 0.2629 | 5000 | 0.0042 | - |
| 0.2655 | 5050 | 0.0005 | - |
| 0.2681 | 5100 | 0.0031 | - |
| 0.2708 | 5150 | 0.0002 | - |
| 0.2734 | 5200 | 0.002 | - |
| 0.2760 | 5250 | 0.0111 | - |
| 0.2786 | 5300 | 0.0286 | - |
| 0.2813 | 5350 | 0.0009 | - |
| 0.2839 | 5400 | 0.0023 | - |
| 0.2865 | 5450 | 0.0079 | - |
| 0.2892 | 5500 | 0.0691 | - |
| 0.2918 | 5550 | 0.0403 | - |
| 0.2944 | 5600 | 0.0002 | - |
| 0.2970 | 5650 | 0.0057 | - |
| 0.2997 | 5700 | 0.0047 | - |
| 0.3023 | 5750 | 0.0322 | - |
| 0.3049 | 5800 | 0.0097 | - |
| 0.3076 | 5850 | 0.0012 | - |
| 0.3102 | 5900 | 0.0047 | - |
| 0.3128 | 5950 | 0.0925 | - |
| 0.3154 | 6000 | 0.0562 | - |
| 0.3181 | 6050 | 0.0058 | - |
| 0.3207 | 6100 | 0.0001 | - |
| 0.3233 | 6150 | 0.0029 | - |
| 0.3260 | 6200 | 0.0001 | - |
| 0.3286 | 6250 | 0.0035 | - |
| 0.3312 | 6300 | 0.0013 | - |
| 0.3338 | 6350 | 0.0152 | - |
| 0.3365 | 6400 | 0.0004 | - |
| 0.3391 | 6450 | 0.0114 | - |
| 0.3417 | 6500 | 0.0906 | - |
| 0.3444 | 6550 | 0.0005 | - |
| 0.3470 | 6600 | 0.0028 | - |
| 0.3496 | 6650 | 0.0395 | - |
| 0.3522 | 6700 | 0.0001 | - |
| 0.3549 | 6750 | 0.0044 | - |
| 0.3575 | 6800 | 0.0121 | - |
| 0.3601 | 6850 | 0.0012 | - |
| 0.3628 | 6900 | 0.0193 | - |
| 0.3654 | 6950 | 0.0014 | - |
| 0.3680 | 7000 | 0.0001 | - |
| 0.3706 | 7050 | 0.0618 | - |
| 0.3733 | 7100 | 0.0066 | - |
| 0.3759 | 7150 | 0.0426 | - |
| 0.3785 | 7200 | 0.0281 | - |
| 0.3812 | 7250 | 0.0254 | - |
| 0.3838 | 7300 | 0.0008 | - |
| 0.3864 | 7350 | 0.0047 | - |
| 0.3890 | 7400 | 0.0088 | - |
| 0.3917 | 7450 | 0.0004 | - |
| 0.3943 | 7500 | 0.0054 | - |
| 0.3969 | 7550 | 0.0371 | - |
| 0.3996 | 7600 | 0.0001 | - |
| 0.4022 | 7650 | 0.0082 | - |
| 0.4048 | 7700 | 0.0162 | - |
| 0.4074 | 7750 | 0.0093 | - |
| 0.4101 | 7800 | 0.0115 | - |
| 0.4127 | 7850 | 0.0114 | - |
| 0.4153 | 7900 | 0.0001 | - |
| 0.4180 | 7950 | 0.0002 | - |
| 0.4206 | 8000 | 0.0098 | - |
| 0.4232 | 8050 | 0.0001 | - |
| 0.4258 | 8100 | 0.0 | - |
| 0.4285 | 8150 | 0.0104 | - |
| 0.4311 | 8200 | 0.0564 | - |
| 0.4337 | 8250 | 0.0002 | - |
| 0.4364 | 8300 | 0.0176 | - |
| 0.4390 | 8350 | 0.0109 | - |
| 0.4416 | 8400 | 0.0001 | - |
| 0.4442 | 8450 | 0.0053 | - |
| 0.4469 | 8500 | 0.0629 | - |
| 0.4495 | 8550 | 0.0324 | - |
| 0.4521 | 8600 | 0.0003 | - |
| 0.4548 | 8650 | 0.0025 | - |
| 0.4574 | 8700 | 0.0032 | - |
| 0.4600 | 8750 | 0.0002 | - |
| 0.4626 | 8800 | 0.0001 | - |
| 0.4653 | 8850 | 0.0475 | - |
| 0.4679 | 8900 | 0.0114 | - |
| 0.4705 | 8950 | 0.0001 | - |
| 0.4732 | 9000 | 0.0028 | - |
| 0.4758 | 9050 | 0.0001 | - |
| 0.4784 | 9100 | 0.0002 | - |
| 0.4810 | 9150 | 0.0001 | - |
| 0.4837 | 9200 | 0.0001 | - |
| 0.4863 | 9250 | 0.0021 | - |
| 0.4889 | 9300 | 0.0001 | - |
| 0.4916 | 9350 | 0.0014 | - |
| 0.4942 | 9400 | 0.0176 | - |
| 0.4968 | 9450 | 0.0005 | - |
| 0.4994 | 9500 | 0.0001 | - |
| 0.5021 | 9550 | 0.0314 | - |
| 0.5047 | 9600 | 0.0613 | - |
| 0.5073 | 9650 | 0.018 | - |
| 0.5100 | 9700 | 0.0 | - |
| 0.5126 | 9750 | 0.0023 | - |
| 0.5152 | 9800 | 0.0013 | - |
| 0.5178 | 9850 | 0.0001 | - |
| 0.5205 | 9900 | 0.0003 | - |
| 0.5231 | 9950 | 0.001 | - |
| 0.5257 | 10000 | 0.0001 | - |
| 0.5284 | 10050 | 0.0193 | - |
| 0.5310 | 10100 | 0.0051 | - |
| 0.5336 | 10150 | 0.0001 | - |
| 0.5362 | 10200 | 0.0005 | - |
| 0.5389 | 10250 | 0.0 | - |
| 0.5415 | 10300 | 0.0001 | - |
| 0.5441 | 10350 | 0.0001 | - |
| 0.5468 | 10400 | 0.0037 | - |
| 0.5494 | 10450 | 0.0309 | - |
| 0.5520 | 10500 | 0.0286 | - |
| 0.5547 | 10550 | 0.0 | - |
| 0.5573 | 10600 | 0.0155 | - |
| 0.5599 | 10650 | 0.0001 | - |
| 0.5625 | 10700 | 0.0077 | - |
| 0.5652 | 10750 | 0.0153 | - |
| 0.5678 | 10800 | 0.0042 | - |
| 0.5704 | 10850 | 0.0103 | - |
| 0.5731 | 10900 | 0.0097 | - |
| 0.5757 | 10950 | 0.0109 | - |
| 0.5783 | 11000 | 0.0001 | - |
| 0.5809 | 11050 | 0.0103 | - |
| 0.5836 | 11100 | 0.0024 | - |
| 0.5862 | 11150 | 0.0001 | - |
| 0.5888 | 11200 | 0.0487 | - |
| 0.5915 | 11250 | 0.0009 | - |
| 0.5941 | 11300 | 0.0001 | - |
| 0.5967 | 11350 | 0.0002 | - |
| 0.5993 | 11400 | 0.0035 | - |
| 0.6020 | 11450 | 0.0005 | - |
| 0.6046 | 11500 | 0.0001 | - |
| 0.6072 | 11550 | 0.0049 | - |
| 0.6099 | 11600 | 0.0396 | - |
| 0.6125 | 11650 | 0.0177 | - |
| 0.6151 | 11700 | 0.0071 | - |
| 0.6177 | 11750 | 0.0071 | - |
| 0.6204 | 11800 | 0.0111 | - |
| 0.6230 | 11850 | 0.0145 | - |
| 0.6256 | 11900 | 0.037 | - |
| 0.6283 | 11950 | 0.0046 | - |
| 0.6309 | 12000 | 0.0258 | - |
| 0.6335 | 12050 | 0.0002 | - |
| 0.6361 | 12100 | 0.002 | - |
| 0.6388 | 12150 | 0.0119 | - |
| 0.6414 | 12200 | 0.0079 | - |
| 0.6440 | 12250 | 0.0239 | - |
| 0.6467 | 12300 | 0.0037 | - |
| 0.6493 | 12350 | 0.0366 | - |
| 0.6519 | 12400 | 0.0201 | - |
| 0.6545 | 12450 | 0.002 | - |
| 0.6572 | 12500 | 0.0652 | - |
| 0.6598 | 12550 | 0.005 | - |
| 0.6624 | 12600 | 0.0034 | - |
| 0.6651 | 12650 | 0.0003 | - |
| 0.6677 | 12700 | 0.0022 | - |
| 0.6703 | 12750 | 0.0001 | - |
| 0.6729 | 12800 | 0.0175 | - |
| 0.6756 | 12850 | 0.0003 | - |
| 0.6782 | 12900 | 0.0085 | - |
| 0.6808 | 12950 | 0.0036 | - |
| 0.6835 | 13000 | 0.0 | - |
| 0.6861 | 13050 | 0.0097 | - |
| 0.6887 | 13100 | 0.006 | - |
| 0.6913 | 13150 | 0.0001 | - |
| 0.6940 | 13200 | 0.0001 | - |
| 0.6966 | 13250 | 0.0379 | - |
| 0.6992 | 13300 | 0.0076 | - |
| 0.7019 | 13350 | 0.0627 | - |
| 0.7045 | 13400 | 0.0605 | - |
| 0.7071 | 13450 | 0.0081 | - |
| 0.7097 | 13500 | 0.0018 | - |
| 0.7124 | 13550 | 0.018 | - |
| 0.7150 | 13600 | 0.0035 | - |
| 0.7176 | 13650 | 0.0001 | - |
| 0.7203 | 13700 | 0.0001 | - |
| 0.7229 | 13750 | 0.0507 | - |
| 0.7255 | 13800 | 0.0082 | - |
| 0.7281 | 13850 | 0.0082 | - |
| 0.7308 | 13900 | 0.0106 | - |
| 0.7334 | 13950 | 0.0067 | - |
| 0.7360 | 14000 | 0.0062 | - |
| 0.7387 | 14050 | 0.0001 | - |
| 0.7413 | 14100 | 0.0246 | - |
| 0.7439 | 14150 | 0.0033 | - |
| 0.7465 | 14200 | 0.0001 | - |
| 0.7492 | 14250 | 0.0432 | - |
| 0.7518 | 14300 | 0.0502 | - |
| 0.7544 | 14350 | 0.0079 | - |
| 0.7571 | 14400 | 0.0291 | - |
| 0.7597 | 14450 | 0.0002 | - |
| 0.7623 | 14500 | 0.0029 | - |
| 0.7649 | 14550 | 0.0321 | - |
| 0.7676 | 14600 | 0.0002 | - |
| 0.7702 | 14650 | 0.0053 | - |
| 0.7728 | 14700 | 0.0094 | - |
| 0.7755 | 14750 | 0.0156 | - |
| 0.7781 | 14800 | 0.071 | - |
| 0.7807 | 14850 | 0.0001 | - |
| 0.7833 | 14900 | 0.0037 | - |
| 0.7860 | 14950 | 0.0544 | - |
| 0.7886 | 15000 | 0.0034 | - |
| 0.7912 | 15050 | 0.0018 | - |
| 0.7939 | 15100 | 0.0014 | - |
| 0.7965 | 15150 | 0.0189 | - |
| 0.7991 | 15200 | 0.0001 | - |
| 0.8017 | 15250 | 0.0057 | - |
| 0.8044 | 15300 | 0.0001 | - |
| 0.8070 | 15350 | 0.0002 | - |
| 0.8096 | 15400 | 0.0003 | - |
| 0.8123 | 15450 | 0.0006 | - |
| 0.8149 | 15500 | 0.1085 | - |
| 0.8175 | 15550 | 0.0003 | - |
| 0.8201 | 15600 | 0.0001 | - |
| 0.8228 | 15650 | 0.0005 | - |
| 0.8254 | 15700 | 0.014 | - |
| 0.8280 | 15750 | 0.0036 | - |
| 0.8307 | 15800 | 0.0001 | - |
| 0.8333 | 15850 | 0.0 | - |
| 0.8359 | 15900 | 0.0 | - |
| 0.8385 | 15950 | 0.0001 | - |
| 0.8412 | 16000 | 0.0001 | - |
| 0.8438 | 16050 | 0.0271 | - |
| 0.8464 | 16100 | 0.0093 | - |
| 0.8491 | 16150 | 0.0444 | - |
| 0.8517 | 16200 | 0.0002 | - |
| 0.8543 | 16250 | 0.0007 | - |
| 0.8569 | 16300 | 0.0002 | - |
| 0.8596 | 16350 | 0.0012 | - |
| 0.8622 | 16400 | 0.0 | - |
| 0.8648 | 16450 | 0.0177 | - |
| 0.8675 | 16500 | 0.0342 | - |
| 0.8701 | 16550 | 0.0288 | - |
| 0.8727 | 16600 | 0.0 | - |
| 0.8753 | 16650 | 0.0024 | - |
| 0.8780 | 16700 | 0.0003 | - |
| 0.8806 | 16750 | 0.0063 | - |
| 0.8832 | 16800 | 0.0442 | - |
| 0.8859 | 16850 | 0.0092 | - |
| 0.8885 | 16900 | 0.0089 | - |
| 0.8911 | 16950 | 0.0027 | - |
| 0.8937 | 17000 | 0.0521 | - |
| 0.8964 | 17050 | 0.0023 | - |
| 0.8990 | 17100 | 0.051 | - |
| 0.9016 | 17150 | 0.0015 | - |
| 0.9043 | 17200 | 0.0003 | - |
| 0.9069 | 17250 | 0.0177 | - |
| 0.9095 | 17300 | 0.0031 | - |
| 0.9121 | 17350 | 0.0205 | - |
| 0.9148 | 17400 | 0.0172 | - |
| 0.9174 | 17450 | 0.0001 | - |
| 0.9200 | 17500 | 0.005 | - |
| 0.9227 | 17550 | 0.0409 | - |
| 0.9253 | 17600 | 0.0001 | - |
| 0.9279 | 17650 | 0.0 | - |
| 0.9306 | 17700 | 0.0002 | - |
| 0.9332 | 17750 | 0.0274 | - |
| 0.9358 | 17800 | 0.0077 | - |
| 0.9384 | 17850 | 0.0078 | - |
| 0.9411 | 17900 | 0.0001 | - |
| 0.9437 | 17950 | 0.0 | - |
| 0.9463 | 18000 | 0.0437 | - |
| 0.9490 | 18050 | 0.0143 | - |
| 0.9516 | 18100 | 0.001 | - |
| 0.9542 | 18150 | 0.0001 | - |
| 0.9568 | 18200 | 0.0428 | - |
| 0.9595 | 18250 | 0.0036 | - |
| 0.9621 | 18300 | 0.0001 | - |
| 0.9647 | 18350 | 0.0001 | - |
| 0.9674 | 18400 | 0.0063 | - |
| 0.9700 | 18450 | 0.0 | - |
| 0.9726 | 18500 | 0.0196 | - |
| 0.9752 | 18550 | 0.0001 | - |
| 0.9779 | 18600 | 0.0001 | - |
| 0.9805 | 18650 | 0.0001 | - |
| 0.9831 | 18700 | 0.0397 | - |
| 0.9858 | 18750 | 0.008 | - |
| 0.9884 | 18800 | 0.015 | - |
| 0.9910 | 18850 | 0.0 | - |
| 0.9936 | 18900 | 0.003 | - |
| 0.9963 | 18950 | 0.025 | - |
| 0.9989 | 19000 | 0.003 | - |
| **1.0** | **19021** | **-** | **0.2343** |

* The bold row denotes the saved checkpoint.

### Framework Versions

- Python: 3.10.12
- SetFit: 1.0.3
- Sentence Transformers: 2.4.0
- Transformers: 4.38.1
- PyTorch: 2.1.0+cu118
- Datasets: 2.17.1
- Tokenizers: 0.15.2
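
To approximate this environment, the versions above can be pinned at install time (a sketch; the CUDA 11.8 build of PyTorch 2.1.0 needs the matching extra index URL and is omitted here):

```bash
pip install "setfit==1.0.3" "sentence-transformers==2.4.0" \
    "transformers==4.38.1" "datasets==2.17.1" "tokenizers==0.15.2"
```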

## Citation

### BibTeX

```bibtex
@article{https://doi.org/10.48550/arxiv.2209.11055,
    doi = {10.48550/ARXIV.2209.11055},
    url = {https://arxiv.org/abs/2209.11055},
    author = {Tunstall, Lewis and Reimers, Nils and Jo, Unso Eun Seo and Bates, Luke and Korat, Daniel and Wasserblat, Moshe and Pereg, Oren},
    keywords = {Computation and Language (cs.CL), FOS: Computer and information sciences, FOS: Computer and information sciences},
    title = {Efficient Few-Shot Learning Without Prompts},
    publisher = {arXiv},
    year = {2022},
    copyright = {Creative Commons Attribution 4.0 International}
}
```