---
license: apache-2.0
language:
  - en
tags:
  - generated_from_trainer
  - spam
  - spam detection
metrics:
  - precision
  - recall
  - accuracy
  - f1
datasets:
  - SetFit/enron_spam
model-index:
  - name: bert-tiny-finetuned-enron-spam-detection
    results: []
widget:
  - text: >-
      buy online and save viagra price for this high demand med best price for
      this high demand med best price for this high demand med buy nowbuy nowbuy
      price for this high demand med best price for this high demand med best
      price for this high demand med buy nowbuy nowbuy nowcialis soft price for
      this high demand med best price for this high demand med best price for
      this high demand med buy nowbuy nowbuy your penis width ( girth ) by 20 %
      gain up to 3 + full inches in length buy nowbuy now
  - text: >-
      aquila dave marks just got a call from someone at aquila saying they got a
      corporate - wide e - mail saying they shouldn ' t trade on enrononline
      anymore . - r
---

# BERT-Tiny fine-tuned on Enron Spam Detection

This model is a fine-tuned version of [google/bert_uncased_L-2_H-128_A-2](https://huggingface.co/google/bert_uncased_L-2_H-128_A-2) (aka BERT-Tiny) on the [SetFit/enron_spam](https://huggingface.co/datasets/SetFit/enron_spam) dataset for the spam detection downstream task. A minimal usage sketch follows the evaluation results below.

It achieves the following results on the evaluation set:

- Loss: 0.0593
- Precision: 0.9851
- Recall: 0.9871
- Accuracy: 0.986
- F1: 0.9861
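
The snippet below is a minimal usage sketch with the `pipeline` API. The repo id is inferred from the model-index name above, and the label-to-class mapping is an assumption that should be checked against the model config:

```python
from transformers import pipeline

# Repo id inferred from the model-index name; adjust if the hub path differs.
classifier = pipeline(
    "text-classification",
    model="mrm8488/bert-tiny-finetuned-enron-spam-detection",
)

# Texts adapted from the widget examples in the metadata above.
print(classifier("buy online and save viagra best price for this high demand med"))
print(classifier("dave , just got a call from someone at aquila about enrononline"))
# Each call returns e.g. [{'label': 'LABEL_1', 'score': 0.99}]; whether
# LABEL_1 means spam or ham is an assumption -- verify via config.id2label.
```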

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a reconstruction as `TrainingArguments` follows the list):

- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 4
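
These values match what the Hugging Face `Trainer` would log; a minimal sketch of equivalent `TrainingArguments` is shown below. The `output_dir` and per-epoch evaluation are assumptions (the latter inferred from the epoch-level results table):

```python
from transformers import TrainingArguments

# Illustrative reconstruction of the hyperparameters listed above.
training_args = TrainingArguments(
    output_dir="bert-tiny-finetuned-enron-spam-detection",  # assumed path
    learning_rate=2e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=32,
    seed=42,
    num_train_epochs=4,
    lr_scheduler_type="linear",
    # Adam betas=(0.9, 0.999) and epsilon=1e-08 are the optimizer defaults.
    evaluation_strategy="epoch",  # assumption: per-epoch eval, matching the table
)
```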

### Training results

| Training Loss | Epoch | Step | Validation Loss | Precision | Recall | Accuracy | F1     |
|:-------------:|:-----:|:----:|:---------------:|:---------:|:------:|:--------:|:------:|
| 0.1125        | 1.0   | 1983 | 0.0797          | 0.9839    | 0.9692 | 0.9765   | 0.9765 |
| 0.061         | 2.0   | 3966 | 0.0618          | 0.9822    | 0.9861 | 0.984    | 0.9842 |
| 0.0486        | 3.0   | 5949 | 0.0593          | 0.9851    | 0.9871 | 0.986    | 0.9861 |
| 0.048         | 4.0   | 7932 | 0.0588          | 0.9870    | 0.9821 | 0.9845   | 0.9846 |
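
The card does not state how these metrics were produced; a minimal `compute_metrics` sketch using scikit-learn, assuming binary (spam vs. ham) averaging, would look like this:

```python
import numpy as np
from sklearn.metrics import accuracy_score, f1_score, precision_score, recall_score

def compute_metrics(eval_pred):
    # eval_pred is the (logits, labels) pair the Trainer passes in.
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)
    return {
        "precision": precision_score(labels, preds),
        "recall": recall_score(labels, preds),
        "accuracy": accuracy_score(labels, preds),
        "f1": f1_score(labels, preds),
    }
```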

### Framework versions

- Transformers 4.23.1
- Pytorch 1.12.1+cu113
- Datasets 2.6.1
- Tokenizers 0.13.1