---
license: apache-2.0
base_model: bert-base-uncased
tags:
  - generated_from_trainer
metrics:
  - accuracy
model-index:
  - name: dir
    results: []
---

bert-emotion

This model is a fine-tuned version of bert-base-uncased on the Emotions dataset from Kaggle. It achieves the following results on the evaluation set and can be used to get a general read of the emotion themes of an English text:

  • Loss: 0.1884
  • Accuracy: 0.936

Model description

This model is a simple custom PyTorch model that uses BERT to classify the emotion of a given text.
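
For illustration, a minimal sketch of what such a custom model might look like is shown below. The class name and the linear-head design are assumptions for the sketch, not necessarily the exact architecture shipped in this repo:

```python
import torch.nn as nn
from transformers import BertModel

class EmotionClassifier(nn.Module):
    """Illustrative custom model: a BERT encoder with a linear head over 6 emotions."""

    def __init__(self, num_labels: int = 6):
        super().__init__()
        self.bert = BertModel.from_pretrained("bert-base-uncased")
        self.dropout = nn.Dropout(0.1)
        self.classifier = nn.Linear(self.bert.config.hidden_size, num_labels)

    def forward(self, input_ids, attention_mask):
        outputs = self.bert(input_ids=input_ids, attention_mask=attention_mask)
        pooled = outputs.pooler_output                   # [batch, hidden_size]
        return self.classifier(self.dropout(pooled))     # [batch, num_labels] logits
```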

Intended uses & limitations

  • It only supports English for now (French support may be added next)
  • The input text is limited in length: it handles a mid-size paragraph easily but not large documents (you can work around this by splitting the document into paragraphs and combining the per-paragraph scores with a weighted sum, as sketched after this list)
  • The emotions it can recognize are limited to the six basic emotions, so it cannot describe mixed or more nuanced psychological states
  • Fine-tuning time: BERT can be slow to train, so anyone building on this idea may want to use DistilBERT for faster results
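
A possible workaround for the length limit (second bullet above) is sketched here; the helper name and the word-count-based weighting are illustrative assumptions, not part of this repo:

```python
import torch

def classify_long_text(document, model, tokenizer, max_length=512):
    """Illustrative workaround: score each paragraph separately, then combine
    the per-paragraph probabilities with length-based weights."""
    paragraphs = [p for p in document.split("\n\n") if p.strip()]
    probs, weights = [], []
    for p in paragraphs:
        enc = tokenizer(p, truncation=True, max_length=max_length, return_tensors="pt")
        with torch.no_grad():
            logits = model(enc["input_ids"], enc["attention_mask"])
        probs.append(torch.softmax(logits, dim=-1).squeeze(0))
        weights.append(len(p.split()))                  # weight each paragraph by word count
    w = torch.tensor(weights, dtype=torch.float)
    w = w / w.sum()
    return torch.stack(probs).T @ w                     # [num_labels] combined probabilities
```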

Training and evaluation data

This dataset contains two key columns: 'text' and 'label'. The 'label' column represents six different emotion classes: sadness (0), joy (1), love (2), anger (3), fear (4), and surprise (5). Get ready to dive deep into the world of human emotions.
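
For convenience, the label mapping described above can be written out as a dictionary (a small sketch; the variable names are illustrative):

```python
# Label ids from the Emotions dataset, as listed above
id2label = {0: "sadness", 1: "joy", 2: "love", 3: "anger", 4: "fear", 5: "surprise"}
label2id = {name: idx for idx, name in id2label.items()}
```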

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 1e-05
  • train_batch_size: 32
  • eval_batch_size: 32
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 3
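
A rough sketch of how these hyperparameters translate into a PyTorch/Transformers training setup is shown below; the stand-in model, the zero warmup steps, and the 625 steps per epoch are assumptions drawn from the results table:

```python
import torch
from transformers import get_linear_schedule_with_warmup

torch.manual_seed(42)                                    # seed: 42

model = torch.nn.Linear(768, 6)                          # stand-in for the BERT classifier
steps_per_epoch = 625                                    # matches the training log below
num_epochs = 3

optimizer = torch.optim.Adam(model.parameters(), lr=1e-5,
                             betas=(0.9, 0.999), eps=1e-8)
scheduler = get_linear_schedule_with_warmup(
    optimizer,
    num_warmup_steps=0,                                  # assumed: no warmup reported
    num_training_steps=steps_per_epoch * num_epochs,     # linear lr schedule
)
```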

Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.9469        | 1.0   | 625  | 0.2593          | 0.9202   |
| 0.2403        | 2.0   | 1250 | 0.2080          | 0.9302   |
| 0.1726        | 3.0   | 1875 | 0.1884          | 0.936    |

Framework versions

  • Transformers 4.38.2
  • PyTorch 2.1.2
  • Datasets 2.1.0
  • Tokenizers 0.15.2