SetFit with sentence-transformers/paraphrase-mpnet-base-v2
This is a SetFit model that can be used for Text Classification. It uses sentence-transformers/paraphrase-mpnet-base-v2 as the Sentence Transformer embedding model and a SetFitHead instance for classification.
The model has been trained using an efficient few-shot learning technique that involves:
- Fine-tuning a Sentence Transformer with contrastive learning.
- Training a classification head with features from the fine-tuned Sentence Transformer.
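As a rough illustration of the contrastive step, the sketch below mimics how labeled sentences can be turned into positive and negative pairs for CosineSimilarityLoss. This is a simplified sketch of the idea, not the library's internal pair sampler; the example sentences are taken from the label table further down.

```python
# Simplified sketch of SetFit-style contrastive pair generation (not the
# library's internal sampler): same-label sentences become positive pairs
# (target cosine similarity 1.0), different-label sentences negative pairs.
from itertools import combinations

def make_pairs(sentences, labels):
    pairs = []
    for (s1, y1), (s2, y2) in combinations(zip(sentences, labels), 2):
        pairs.append((s1, s2, 1.0 if y1 == y2 else 0.0))
    return pairs

examples = ["It is very funny .", "Yours Sincerely .", "stop shouting . do n't shout ."]
labels = [0, 4, 4]
print(make_pairs(examples, labels))
# [('It is very funny .', 'Yours Sincerely .', 0.0),
#  ('It is very funny .', "stop shouting . do n't shout .", 0.0),
#  ('Yours Sincerely .', "stop shouting . do n't shout .", 1.0)]
```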
Model Details
Model Labels
| Label | Examples |
|:------|:---------|
| 7 | <ul><li>"When I 've had a very bad and stressful day I can relax doing karate , because It 's the kind of sport that it is n't very hard ."</li><li>"Also , you 'll meet friendly people who usually ask to you something to be friends and change your telephone number ."</li><li>'When I have spare time , I often gather my friends to watch basketball match on television .'</li></ul> |
| 4 | <ul><li>"stop shouting . do n't shout ."</li><li>'Yours Sincerely .'</li><li>'Something that they don know was that the whole thing was a movie !'</li></ul> |
| 1 | <ul><li>'She stay sleeping in the bed and doing nothing all day .'</li><li>'People collects trash of their house and await the trash truck that carried the trash to a landfill located outside the village .'</li><li>"Travelling by car is n't so much more convenient unless it is so much more comfortable , but actually we do n't think about the contamination in our planet ."</li></ul> |
| 6 | <ul><li>'When the concert finished , we went to cloakroom to get signatures from musicians .'</li><li>'We have solar panels and a place to make compost at the last garden , with worms who eat and degrade all the organic waste of the school .'</li><li>'The aim of this report is to give you my personal point of view of the course I did in your branch in Madrid last month .'</li></ul> |
| 5 | <ul><li>'You can also bought a lot of gifts like key chains , statue , or what else memories to be made before returning to Malaysia .'</li><li>'I always said that I passed that test and I was sure of that .'</li><li>'In addition , to decrease the risk of negative comments or posts , Facebook and Twitter would improve their futures to solve the less personal privacy problem .'</li></ul> |
| 2 | <ul><li>'They were not only really clever people but also excellent co - workers .'</li><li>'On balance , learning foreign languages is very positive on different aspect , so if you have the positivity of learning a new language do it , because it will bring you many benefits .'</li><li>'In many years of work I have honed my skills in managing non - standard situations , analyzing the problem , finding and implementing practical and easy solutions .'</li></ul> |
| 0 | <ul><li>'It is very funny .'</li><li>'In China , English is took to be a foreign language which many students choose to learn .'</li><li>'We also value that they have specialised studies in Cloud technology , and hosting management .'</li></ul> |
| 3 | <ul><li>"Usually there are generation problems , sons do n't understand parents and vicecersa , but dialoging and listening emotions and facts , everyone can have another point of view ."</li><li>'the two boys heard that he was planing to steal some money and kill people so the boys start their adventure on stoping Injuin Joe ...'</li><li>'As an example , if you are able to get alone with your travel companion could enjoy each moment of the trip , exchange some pictures , eat together , and visit places with common interest such as museums or malls .'</li></ul> |
Evaluation
Metrics
| Label | Accuracy |
|:------|:---------|
| all   | 0.1315   |
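Note that 0.1315 is only slightly above the roughly 0.125 chance level of a balanced eight-way task, consistent with the very small training set. A minimal sketch for recomputing this metric is shown below; the sentences and labels in it are placeholders, not the actual held-out evaluation split.

```python
# Minimal sketch for recomputing the accuracy above; the sentences and
# labels here are placeholders, not the actual held-out evaluation split.
from setfit import SetFitModel

model = SetFitModel.from_pretrained("HelgeKn/BEA2019-multi-class-4")
eval_texts = ["It is very funny .", "Yours Sincerely ."]  # placeholder texts
eval_labels = [0, 4]                                      # placeholder labels
preds = model.predict(eval_texts)
accuracy = sum(int(p) == y for p, y in zip(preds, eval_labels)) / len(eval_labels)
print(f"accuracy = {accuracy:.4f}")
```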
Uses
Direct Use for Inference
First install the SetFit library:
```bash
pip install setfit
```
Then you can load this model and run inference.
```python
from setfit import SetFitModel

# Download the model from the Hugging Face Hub
model = SetFitModel.from_pretrained("HelgeKn/BEA2019-multi-class-4")
# Run inference on a single sentence
preds = model("Had 12 years old .")
```
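`predict` also accepts a list of sentences, and since this model uses the differentiable `SetFitHead`, `predict_proba` should return per-class probabilities as well. A brief sketch continuing from the snippet above; both sentences are taken from the label table earlier in this card:

```python
# Batch inference over several sentences at once
preds = model.predict([
    "It is very funny .",
    "Yours Sincerely .",
])
# Per-class probabilities (available with the differentiable SetFitHead)
probs = model.predict_proba(["It is very funny ."])
print(preds, probs)
```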
Training Details
Training Set Metrics
| Training set | Min | Median  | Max |
|:-------------|:----|:--------|:----|
| Word count   | 3   | 19.1562 | 42  |
| Label | Training Sample Count |
|:------|:----------------------|
| 0     | 4                     |
| 1     | 4                     |
| 2     | 4                     |
| 3     | 4                     |
| 4     | 4                     |
| 5     | 4                     |
| 6     | 4                     |
| 7     | 4                     |
Training Hyperparameters
- batch_size: (16, 16)
- num_epochs: (2, 2)
- max_steps: -1
- sampling_strategy: oversampling
- num_iterations: 20
- body_learning_rate: (2e-05, 2e-05)
- head_learning_rate: 2e-05
- loss: CosineSimilarityLoss
- distance_metric: cosine_distance
- margin: 0.25
- end_to_end: False
- use_amp: False
- warmup_proportion: 0.1
- seed: 42
- eval_max_steps: -1
- load_best_model_at_end: False
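As a rough guide, the run above corresponds to something like the following SetFit 1.x training sketch. The four-sentence `train_dataset` is a toy stand-in (sentences taken from the label table above; the real run used 4 examples for each of the 8 labels), and the arguments not listed explicitly fall back to the library defaults shown in the hyperparameter list.

```python
# Minimal training sketch matching the hyperparameters listed above;
# the tiny train_dataset is a stand-in for the real 32-sentence set.
from datasets import Dataset
from setfit import SetFitModel, Trainer, TrainingArguments

train_dataset = Dataset.from_dict({
    "text": ["It is very funny .", "Yours Sincerely .",
             "She stay sleeping in the bed and doing nothing all day .",
             "stop shouting . do n't shout ."],
    "label": [0, 4, 1, 4],
})

# use_differentiable_head=True selects the SetFitHead used by this model
model = SetFitModel.from_pretrained(
    "sentence-transformers/paraphrase-mpnet-base-v2",
    use_differentiable_head=True,
    head_params={"out_features": 8},
)

args = TrainingArguments(
    batch_size=(16, 16),               # (embedding phase, classifier phase)
    num_epochs=(2, 2),
    num_iterations=20,                 # contrastive pairs per training sample
    body_learning_rate=(2e-5, 2e-5),
    head_learning_rate=2e-5,
    sampling_strategy="oversampling",
    warmup_proportion=0.1,
    seed=42,
)

trainer = Trainer(model=model, args=args, train_dataset=train_dataset)
trainer.train()
```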
Training Results
| Epoch  | Step | Training Loss | Validation Loss |
|:-------|:-----|:--------------|:----------------|
| 0.0125 | 1    | 0.1886        | -               |
| 0.625  | 50   | 0.0778        | -               |
| 1.25   | 100  | 0.0194        | -               |
| 1.875  | 150  | 0.0069        | -               |
Framework Versions
- Python: 3.9.13
- SetFit: 1.0.1
- Sentence Transformers: 2.2.2
- Transformers: 4.36.0
- PyTorch: 2.1.1+cpu
- Datasets: 2.15.0
- Tokenizers: 0.15.0
Citation
BibTeX
```bibtex
@article{https://doi.org/10.48550/arxiv.2209.11055,
    doi = {10.48550/ARXIV.2209.11055},
    url = {https://arxiv.org/abs/2209.11055},
    author = {Tunstall, Lewis and Reimers, Nils and Jo, Unso Eun Seo and Bates, Luke and Korat, Daniel and Wasserblat, Moshe and Pereg, Oren},
    keywords = {Computation and Language (cs.CL), FOS: Computer and information sciences, FOS: Computer and information sciences},
    title = {Efficient Few-Shot Learning Without Prompts},
    publisher = {arXiv},
    year = {2022},
    copyright = {Creative Commons Attribution 4.0 International}
}
```