---
library_name: setfit
tags:
- setfit
- sentence-transformers
- text-classification
- generated_from_setfit_trainer
metrics:
- accuracy
widget:
- text: As someone on the line between Millenial and GenZ, yeah. Bars are expensive and loud, and ubers home are expensive. It's a lot more reasonable to pool a bit of money, throw some food on a grill, and buy our own booze. We don't have the disposable income to hang out at bars regularly.
- text: When we switch main focus from college football to college basketball, I can report back on Collier. But I'll be interested to see what the guys who really crunch tape on draft prospects say as these seasons progress. I know theres more than a few here in the sub. A huge 3 with skills would be fun to stack next to Wemby though.
- text: The gen Z kids I see are more risk averse in general, because exposure to a lifetime on the internet has taught them that one mistake can ruin their lives. It always blows my mind when boomers and Xers like me wonder why kids have such high anxiety these days. It’s because they are regularly exposed to the judgement and horrors of the world around them. We were raised in a protective bubble mentally, in comparison
- text: Well I guess I would expect this from a beer garden but I totally agree, those vibes don’t belong at Coachella
- text: Can Earned the Brewery Pioneer (Level 6) badge! Earned the I Believe in IPA! (Level 5) badge!
pipeline_tag: text-classification
inference: true
base_model: sentence-transformers/paraphrase-mpnet-base-v2
---

# SetFit with sentence-transformers/paraphrase-mpnet-base-v2

This is a [SetFit](https://github.com/huggingface/setfit) model that can be used for text classification. This SetFit model uses [sentence-transformers/paraphrase-mpnet-base-v2](https://huggingface.co/sentence-transformers/paraphrase-mpnet-base-v2) as the Sentence Transformer embedding model.
A [LogisticRegression](https://scikit-learn.org/stable/modules/generated/sklearn.linear_model.LogisticRegression.html) instance is used for classification. The model has been trained using an efficient few-shot learning technique that involves:

1. Fine-tuning a [Sentence Transformer](https://www.sbert.net) with contrastive learning.
2. Training a classification head with features from the fine-tuned Sentence Transformer.

## Model Details

### Model Description
- **Model Type:** SetFit
- **Sentence Transformer body:** [sentence-transformers/paraphrase-mpnet-base-v2](https://huggingface.co/sentence-transformers/paraphrase-mpnet-base-v2)
- **Classification head:** a [LogisticRegression](https://scikit-learn.org/stable/modules/generated/sklearn.linear_model.LogisticRegression.html) instance
- **Maximum Sequence Length:** 512 tokens
- **Number of Classes:** 3 classes

### Model Sources
- **Repository:** [SetFit on GitHub](https://github.com/huggingface/setfit)
- **Paper:** [Efficient Few-Shot Learning Without Prompts](https://arxiv.org/abs/2209.11055)
- **Blogpost:** [SetFit: Efficient Few-Shot Learning Without Prompts](https://huggingface.co/blog/setfit)

### Model Labels
| Label | Examples |
|:------|:---------|
| 1 |  |
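The second training step above can be illustrated without loading any model weights. In a real pipeline the fine-tuned mpnet body produces 768-dimensional sentence embeddings that are fed to the LogisticRegression head; the sketch below substitutes random vectors for those embeddings purely to show the shape of the data flow, so the numbers (30 examples, 3 classes) are placeholders, not this model's actual training setup.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Placeholder for step 2 of SetFit training: fit a LogisticRegression head
# on sentence-embedding features. Random vectors stand in for the 768-dim
# mpnet embeddings since no embedding model is loaded here.
rng = np.random.default_rng(0)
X_train = rng.normal(size=(30, 768))  # 30 "sentence embeddings"
y_train = np.repeat([0, 1, 2], 10)    # 3 classes, 10 examples each

head = LogisticRegression(max_iter=1000)
head.fit(X_train, y_train)

preds = head.predict(X_train)         # one label per input embedding
print(preds.shape)                    # (30,)
```

For actual inference with this card's model, the `setfit` library's `SetFitModel.from_pretrained(...)` followed by `model.predict([...])` handles both the embedding and classification steps in one call.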