---
base_model: sentence-transformers/all-mpnet-base-v2
library_name: setfit
metrics:
- accuracy
pipeline_tag: text-classification
tags:
- setfit
- sentence-transformers
- text-classification
- generated_from_setfit_trainer
widget:
- text: >
    I have alcoholism, drug abouse and suicide all over my family as far back
    as three generations. After seeing several friend in college (class of
    '76, U of Arkansas, Go Hogs!) get blind drunk and raped at frat parties,
    I decided I could live without it. And I have -- even through five years
    active duty in the army. I cook with wine and my husband likes a daily
    beer in the summer. I haven't missed a thing, I'm height-weight
    proportionate and probably a few pesos richer for not having squandered
    money on booze. I live outside the US and I've seen dozens of women
    battered beyond recognition by drunk husbands, children neglected by their
    parents almost to a point of starvation, and families ruptured and ruined
    by alcohol. It ain't worth it.
- text: >
    The War Between the Catholic Cardinals Two essays make plain the different
    views often obscured by careful political maneuvering within the church.
    The death of the pope emeritus, Benedict XVI, was succeeded by a small
    literary outpouring, a rush of publications that were interpreted as
    salvos in the Catholic Church’s civil war. The list includes a memoir by
    Benedict’s longtime secretary that mentioned the former pontiff’s
    disappointment at his successor’s restriction of the Latin Mass, a
    posthumous essay collection by Benedict himself that’s being mined for
    controversial quotes, and an Associated Press interview with Pope Francis
    that made news for its call to decriminalize homosexuality around the
    world. Two essays make plain the different views often obscured by careful
    political maneuvering within the church.
- text: >
    "As one of the 100,000 or so Catholics in this country who attend the old
    Mass each week, I will always be grateful to him for allowing for its
    widespread celebration despite the promulgation of a new, vernacular
    liturgy."This really says it all. "Soren Kierkegaard?"? Really? Mr.
    Walther may be the editor of "The Lamp," but his lamp sheds no light on
    Ratzinger or the fundamental evils of the continuous and painfully slow
    downward spiral that has been the trajectory of the papacy and the Roman
    Catholic Church for a very long time.Vatican II opened up a great hope.
    Ratzinger and his followers saw in it only a threat to the cult of
    secrecy, both of the sacraments, and the sins. They have done much to
    unravel the all of the inherent good of Vatican II -- which actually made
    Catholicism interesting and meaningful to youths at a time of great
    cynicism in the world. Walther and his 100,000 should form their own 4th
    century Catholic schism, "despite the promulgation of a new, vernacular
    liturgy," and leave what's left of the Catholic church alone to re-build.
- text: >
    Benedict, the reluctant popeThe former Cardinal Ratzinger had never wanted
    to be pope, planning at age 78 to spend his final years writing in the
    “peace and quiet” of his native Bavaria.Instead, he was forced to follow
    the footsteps of the beloved St. John Paul II and run the church through
    the fallout of the clerical sex abuse scandal Being elected pope, he once
    said, felt like a “guillotine” had come down on him. Nevertheless, he set
    about the job with a single-minded vision to rekindle the faith in a world
    that, he frequently lamented, seemed to think it could do without God.“In
    vast areas of the world today, there is a strange forgetfulness of God,”
    he told one million young people gathered on a vast field for his first
    foreign trip as pope, to World Youth Day in Cologne, Germany, in 2005. “It
    seems as if everything would be just the same even without him.”With some
    decisive, .. he tried to remind Europe of its Christian heritage. And he
    set the Catholic Church on a conservative, tradition-minded path that
    often alienated progressives. He relaxed the restrictions on celebrating
    the old Latin Mass. It was a path that in many ways was reversed by his
    successor, Francis, whose mercy-over-morals priorities alienated the
    traditionalists Benedict’s style couldn’t have been more different from
    that of Francis. No globe-trotting media darling or populist, Benedict was
    a teacher, theologian and academic to the core: quiet and pensive with a
    fierce mind. El Pais Dec
- text: >
    Willy Stone Wouldn't it be a pity if all ancient art could only be seen
    in the location where it was made?
inference: true
model-index:
- name: SetFit with sentence-transformers/all-mpnet-base-v2
  results:
  - task:
      type: text-classification
      name: Text Classification
    dataset:
      name: Unknown
      type: unknown
      split: test
    metrics:
    - type: accuracy
      value: 1.0
      name: Accuracy
---

# SetFit with sentence-transformers/all-mpnet-base-v2
This is a [SetFit](https://github.com/huggingface/setfit) model that can be used for Text Classification. This SetFit model uses [sentence-transformers/all-mpnet-base-v2](https://huggingface.co/sentence-transformers/all-mpnet-base-v2) as the Sentence Transformer embedding model. A [LogisticRegression](https://scikit-learn.org/stable/modules/generated/sklearn.linear_model.LogisticRegression.html) instance is used for classification.
The model has been trained using an efficient few-shot learning technique that involves two steps (sketched in code after this list):
- Fine-tuning a Sentence Transformer with contrastive learning.
- Training a classification head with features from the fine-tuned Sentence Transformer.
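Both steps are handled by the `setfit` `Trainer`. The following is a minimal sketch of that workflow; the texts and labels are hypothetical stand-ins, not the actual training data:

```python
from datasets import Dataset
from setfit import SetFitModel, Trainer, TrainingArguments

# Hypothetical few-shot examples; the real training set is not published here.
train_dataset = Dataset.from_dict({
    "text": [
        "Comment discussing Benedict XVI and the Latin Mass.",
        "Comment about alcohol abuse in families.",
    ],
    "label": ["yes", "no"],
})

# Start from the pretrained Sentence Transformer body.
model = SetFitModel.from_pretrained("sentence-transformers/all-mpnet-base-v2")

args = TrainingArguments(batch_size=16, num_epochs=1)
trainer = Trainer(model=model, args=args, train_dataset=train_dataset)

# train() runs both steps: contrastive fine-tuning of the embedding body,
# then fitting the LogisticRegression head on the tuned embeddings.
trainer.train()
```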
## Model Details

### Model Description
- Model Type: SetFit
- Sentence Transformer body: sentence-transformers/all-mpnet-base-v2
- Classification head: a LogisticRegression instance
- Maximum Sequence Length: 384 tokens
- Number of Classes: 2 classes
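These components are exposed directly on the loaded model object; a quick inspection sketch (attribute names as in the `setfit` library):

```python
from setfit import SetFitModel

model = SetFitModel.from_pretrained("davidadamczyk/setfit-model-10")

# The embedding body and the classification head are separate attributes.
print(type(model.model_body).__name__)   # SentenceTransformer
print(type(model.model_head).__name__)   # LogisticRegression
print(model.model_body.max_seq_length)   # 384; longer inputs are truncated
```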
### Model Sources

- Repository: [SetFit on GitHub](https://github.com/huggingface/setfit)
- Paper: [Efficient Few-Shot Learning Without Prompts](https://arxiv.org/abs/2209.11055)
- Blogpost: [SetFit: Efficient Few-Shot Learning Without Prompts](https://huggingface.co/blog/setfit)
### Model Labels

| Label | Examples |
|---|---|
| yes | |
| no | |
## Evaluation

### Metrics

| Label | Accuracy |
|---|---|
| all | 1.0 |
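The accuracy was measured on a held-out split that is not published ("Unknown" above), so the figure should be read as exact-match accuracy on a small test set. A sketch of how such a score can be recomputed, with hypothetical placeholder data:

```python
from setfit import SetFitModel
from sklearn.metrics import accuracy_score

model = SetFitModel.from_pretrained("davidadamczyk/setfit-model-10")

# Placeholder held-out examples; substitute the real test split.
test_texts = ["Comment about the papacy and the Latin Mass."]
test_labels = ["yes"]

preds = model.predict(test_texts)
print(accuracy_score(test_labels, preds))
```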
## Uses

### Direct Use for Inference

First install the SetFit library:

```bash
pip install setfit
```
Then you can load this model and run inference.

```python
from setfit import SetFitModel

# Download from the 🤗 Hub
model = SetFitModel.from_pretrained("davidadamczyk/setfit-model-10")
# Run inference
preds = model("Willy Stone Wouldn't it be a pity if all ancient art could only be seen in the location where it was made?")
```
## Training Details

### Training Set Metrics

| Training set | Min | Median | Max |
|---|---|---|---|
| Word count | 15 | 123.625 | 286 |

| Label | Training Sample Count |
|---|---|
| no | 18 |
| yes | 22 |
### Training Hyperparameters
- batch_size: (16, 16)
- num_epochs: (1, 1)
- max_steps: -1
- sampling_strategy: oversampling
- num_iterations: 120
- body_learning_rate: (2e-05, 2e-05)
- head_learning_rate: 2e-05
- loss: CosineSimilarityLoss
- distance_metric: cosine_distance
- margin: 0.25
- end_to_end: False
- use_amp: False
- warmup_proportion: 0.1
- l2_weight: 0.01
- seed: 42
- eval_max_steps: -1
- load_best_model_at_end: False
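These fields mirror `setfit.TrainingArguments`, so the configuration above can be reconstructed as in the sketch below; tuples carry separate values for the embedding phase and the classifier phase:

```python
from sentence_transformers.losses import (
    BatchHardTripletLossDistanceFunction,
    CosineSimilarityLoss,
)
from setfit import TrainingArguments

args = TrainingArguments(
    batch_size=(16, 16),                # (embedding phase, classifier phase)
    num_epochs=(1, 1),
    max_steps=-1,                       # -1 means no step cap
    sampling_strategy="oversampling",
    num_iterations=120,
    body_learning_rate=(2e-05, 2e-05),
    head_learning_rate=2e-05,
    loss=CosineSimilarityLoss,
    distance_metric=BatchHardTripletLossDistanceFunction.cosine_distance,
    margin=0.25,
    end_to_end=False,
    use_amp=False,
    warmup_proportion=0.1,
    l2_weight=0.01,
    seed=42,
    eval_max_steps=-1,
    load_best_model_at_end=False,
)
```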
### Training Results

| Epoch | Step | Training Loss | Validation Loss |
|---|---|---|---|
| 0.0017 | 1 | 0.365 | - |
| 0.0833 | 50 | 0.1213 | - |
| 0.1667 | 100 | 0.0018 | - |
| 0.25 | 150 | 0.0004 | - |
| 0.3333 | 200 | 0.0002 | - |
| 0.4167 | 250 | 0.0002 | - |
| 0.5 | 300 | 0.0001 | - |
| 0.5833 | 350 | 0.0001 | - |
| 0.6667 | 400 | 0.0001 | - |
| 0.75 | 450 | 0.0001 | - |
| 0.8333 | 500 | 0.0001 | - |
| 0.9167 | 550 | 0.0001 | - |
| 1.0 | 600 | 0.0001 | - |
### Framework Versions
- Python: 3.10.13
- SetFit: 1.1.0
- Sentence Transformers: 3.0.1
- Transformers: 4.45.2
- PyTorch: 2.4.0+cu124
- Datasets: 2.21.0
- Tokenizers: 0.20.0
## Citation

### BibTeX
```bibtex
@article{https://doi.org/10.48550/arxiv.2209.11055,
  doi = {10.48550/ARXIV.2209.11055},
  url = {https://arxiv.org/abs/2209.11055},
  author = {Tunstall, Lewis and Reimers, Nils and Jo, Unso Eun Seo and Bates, Luke and Korat, Daniel and Wasserblat, Moshe and Pereg, Oren},
  keywords = {Computation and Language (cs.CL), FOS: Computer and information sciences, FOS: Computer and information sciences},
  title = {Efficient Few-Shot Learning Without Prompts},
  publisher = {arXiv},
  year = {2022},
  copyright = {Creative Commons Attribution 4.0 International}
}
```