---
library_name: setfit
tags:
  - setfit
  - sentence-transformers
  - text-classification
  - generated_from_setfit_trainer
datasets:
  - konsman/setfit-messages-updated-influence-level
metrics:
  - accuracy
widget:
  - text: >-
      The influence level of Staying hydrated is especially important for older
      adults to prevent dehydration.
  - text: >-
      The influence level of Regularly updating emergency contact information is
      important for the elderly.
  - text: >-
      The influence level of Early detection saves lives. Support breast cancer
      awareness month.  Wear pink, spread awareness. Stand with us this breast
      cancer awareness month. 
  - text: >-
      The influence level of Mental Health Day is approaching. Join our online
      discussion on well-being.  Prioritize mental health. Participate in our
      online discussion this Mental Health Day. 
  - text: >-
      The influence level of Regular kidney function tests are important for
      those with high blood pressure.
pipeline_tag: text-classification
inference: true
base_model: sentence-transformers/all-mpnet-base-v2
model-index:
  - name: SetFit with sentence-transformers/all-mpnet-base-v2
    results:
      - task:
          type: text-classification
          name: Text Classification
        dataset:
          name: konsman/setfit-messages-updated-influence-level
          type: konsman/setfit-messages-updated-influence-level
          split: test
        metrics:
          - type: accuracy
            value: 0.47368421052631576
            name: Accuracy
---

SetFit with sentence-transformers/all-mpnet-base-v2

This is a SetFit model trained on the konsman/setfit-messages-updated-influence-level dataset that can be used for Text Classification. This SetFit model uses sentence-transformers/all-mpnet-base-v2 as the Sentence Transformer embedding model. A LogisticRegression instance is used for classification.

The model has been trained using an efficient few-shot learning technique that involves:

  1. Fine-tuning a Sentence Transformer with contrastive learning.
  2. Training a classification head with features from the fine-tuned Sentence Transformer (a minimal training sketch follows below).
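
In practice, these two phases are what setfit's Trainer runs back to back. The snippet below is only a rough sketch of such a run, not the exact script used for this model; it assumes the dataset ships "train" and "test" splits with "text" and "label" columns, and it samples 8 examples per class to mirror the training set metrics shown further down.

from datasets import load_dataset
from setfit import SetFitModel, Trainer, TrainingArguments, sample_dataset

# Assumed layout: "train"/"test" splits with "text" and "label" columns
dataset = load_dataset("konsman/setfit-messages-updated-influence-level")
# Simulate the few-shot setting with 8 examples per class
train_dataset = sample_dataset(dataset["train"], label_column="label", num_samples=8)

# Start from the base embedding model; the classification head defaults to LogisticRegression
model = SetFitModel.from_pretrained("sentence-transformers/all-mpnet-base-v2")

args = TrainingArguments(batch_size=8, num_epochs=4, num_iterations=40)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=train_dataset,
    eval_dataset=dataset["test"],
    metric="accuracy",
)
trainer.train()            # phase 1 (contrastive fine-tuning) + phase 2 (head fitting)
print(trainer.evaluate())  # e.g. {'accuracy': ...}

The full configuration this model was actually trained with is listed under Training Hyperparameters below.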

Model Details

Model Description

  • Model Type: SetFit
  • Sentence Transformer body: sentence-transformers/all-mpnet-base-v2
  • Classification head: a LogisticRegression instance
  • Number of Classes: 4
  • Training Dataset: konsman/setfit-messages-updated-influence-level

Model Sources

  • Repository: SetFit on GitHub (https://github.com/huggingface/setfit)
  • Paper: Efficient Few-Shot Learning Without Prompts (https://arxiv.org/abs/2209.11055)

Model Labels

Label Examples
Label 0
  • 'The influence level of Understanding the effects of aging on the body is key for caregivers.'
  • 'The influence level of Regular check-ups are key to maintaining good health.'
  • 'The influence level of Balanced nutrition is key for maintaining health in old age.'
Label 1
  • "The influence level of Time for your 3pm medication! Please take as directed. Friendly reminder: It's time for your 3pm medication. Ensure to take it as prescribed."
  • 'The influence level of Regular bladder function tests are important for elderly individuals.'
  • 'The influence level of How was your telehealth session? Share your feedback. Help us improve. Provide feedback on your recent telehealth appointment. '
Label 2
  • "The influence level of A support group meeting is scheduled for tomorrow at 5pm. It's a great opportunity to share and learn. Connect with others in our support group meeting tomorrow. See you at 5pm!"
  • 'The influence level of Safety first! Please update your emergency contact details in our system. Ensure swift help when needed. Update your emergency contacts in our app. '
  • 'The influence level of Regularly discussing health concerns with doctors is important for the elderly.'
Label 3
  • 'The influence level of Understanding the role of dietary supplements in elderly health is important.'
  • 'The influence level of Proper medication management is essential for effective treatment.'
  • "The influence level of Your child's health is paramount. Reminder for the pediatrician appointment tomorrow. Ensure the best for your child. Don't miss the pediatrician appointment set for tomorrow. "

Evaluation

Metrics

| Label | Accuracy |
|:------|:---------|
| all   | 0.4737   |
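
The reported accuracy is measured on the test split of the training dataset. A minimal sketch of how such a number could be recomputed is shown below; it assumes the test split exposes "text" and "label" columns, which may not match the exact evaluation setup used for this card.

from datasets import load_dataset
from setfit import SetFitModel

model = SetFitModel.from_pretrained("konsman/setfit-messages-label-v2")
test_dataset = load_dataset("konsman/setfit-messages-updated-influence-level", split="test")

# Predict a label for every test text and compare with the references
preds = model.predict(test_dataset["text"])
correct = sum(int(pred) == int(ref) for pred, ref in zip(preds, test_dataset["label"]))
print(f"accuracy: {correct / len(test_dataset):.4f}")  # the card reports 0.4737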

Uses

Direct Use for Inference

First install the SetFit library:

pip install setfit

Then you can load this model and run inference.

from setfit import SetFitModel

# Download from the 🤗 Hub
model = SetFitModel.from_pretrained("konsman/setfit-messages-label-v2")
# Run inference
preds = model("The influence level of Regularly updating emergency contact information is important for the elderly.")
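
The model (equivalently model.predict) also accepts a list of texts and returns one prediction per input; for this model the predictions are the integer classes 0-3 listed under Model Labels above. A small illustrative sketch, reusing example texts from that table:

# Batch inference: one predicted class (0-3) per input text
preds = model([
    "The influence level of Regular check-ups are key to maintaining good health.",
    "The influence level of Proper medication management is essential for effective treatment.",
])
print(preds)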

Training Details

Training Set Metrics

| Training set | Min | Median  | Max |
|:-------------|:----|:--------|:----|
| Word count   | 12  | 20.8438 | 36  |

| Label | Training Sample Count |
|:------|:----------------------|
| 0     | 8                     |
| 1     | 8                     |
| 2     | 8                     |
| 3     | 8                     |

Training Hyperparameters

  • batch_size: (8, 8)
  • num_epochs: (4, 4)
  • max_steps: -1
  • sampling_strategy: oversampling
  • num_iterations: 40
  • body_learning_rate: (2.2041595048800003e-05, 2.2041595048800003e-05)
  • head_learning_rate: 2.2041595048800003e-05
  • loss: CosineSimilarityLoss
  • distance_metric: cosine_distance
  • margin: 0.25
  • end_to_end: False
  • use_amp: False
  • warmup_proportion: 0.1
  • seed: 42
  • eval_max_steps: -1
  • load_best_model_at_end: False
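
These entries correspond to fields of setfit's TrainingArguments, so the configuration can be roughly reconstructed as below. This is a sketch for orientation, not the original training invocation.

from sentence_transformers.losses import CosineSimilarityLoss
from setfit import TrainingArguments

# Tuples are (embedding fine-tuning phase, classifier head phase) values
args = TrainingArguments(
    batch_size=(8, 8),
    num_epochs=(4, 4),
    max_steps=-1,
    sampling_strategy="oversampling",
    num_iterations=40,
    body_learning_rate=(2.2041595048800003e-05, 2.2041595048800003e-05),
    head_learning_rate=2.2041595048800003e-05,
    loss=CosineSimilarityLoss,
    end_to_end=False,
    use_amp=False,
    warmup_proportion=0.1,
    seed=42,
    eval_max_steps=-1,
    load_best_model_at_end=False,
)
# distance_metric ("cosine_distance") and margin (0.25) are left at their defaults here;
# they apply to triplet-style losses rather than CosineSimilarityLoss.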

Training Results

| Epoch  | Step | Training Loss | Validation Loss |
|:------:|:----:|:-------------:|:---------------:|
| 0.0031 | 1    | 0.1587        | -               |
| 0.1562 | 50   | 0.116         | -               |
| 0.3125 | 100  | 0.0918        | -               |
| 0.4688 | 150  | 0.0042        | -               |
| 0.625  | 200  | 0.0005        | -               |
| 0.7812 | 250  | 0.0012        | -               |
| 0.9375 | 300  | 0.0005        | -               |
| 1.0938 | 350  | 0.0005        | -               |
| 1.25   | 400  | 0.0003        | -               |
| 1.4062 | 450  | 0.0002        | -               |
| 1.5625 | 500  | 0.0002        | -               |
| 1.7188 | 550  | 0.0001        | -               |
| 1.875  | 600  | 0.0001        | -               |
| 2.0312 | 650  | 0.0002        | -               |
| 2.1875 | 700  | 0.0001        | -               |
| 2.3438 | 750  | 0.0001        | -               |
| 2.5    | 800  | 0.0001        | -               |
| 2.6562 | 850  | 0.0001        | -               |
| 2.8125 | 900  | 0.0001        | -               |
| 2.9688 | 950  | 0.0001        | -               |
| 3.125  | 1000 | 0.0002        | -               |
| 3.2812 | 1050 | 0.0001        | -               |
| 3.4375 | 1100 | 0.0001        | -               |
| 3.5938 | 1150 | 0.0001        | -               |
| 3.75   | 1200 | 0.0001        | -               |
| 3.9062 | 1250 | 0.0001        | -               |

Framework Versions

  • Python: 3.10.12
  • SetFit: 1.0.2
  • Sentence Transformers: 2.2.2
  • Transformers: 4.35.2
  • PyTorch: 2.1.0+cu121
  • Datasets: 2.16.1
  • Tokenizers: 0.15.0

Citation

BibTeX

@article{https://doi.org/10.48550/arxiv.2209.11055,
    doi = {10.48550/ARXIV.2209.11055},
    url = {https://arxiv.org/abs/2209.11055},
    author = {Tunstall, Lewis and Reimers, Nils and Jo, Unso Eun Seo and Bates, Luke and Korat, Daniel and Wasserblat, Moshe and Pereg, Oren},
    keywords = {Computation and Language (cs.CL), FOS: Computer and information sciences, FOS: Computer and information sciences},
    title = {Efficient Few-Shot Learning Without Prompts},
    publisher = {arXiv},
    year = {2022},
    copyright = {Creative Commons Attribution 4.0 International}
}