---
license: apache-2.0
tags:
- generated_from_trainer
datasets:
- AdamCodd/emotion-balanced
metrics:
- accuracy
- f1
- recall
- precision
widget:
- text: "He looked out of the rain-streaked window, lost in thought, the faintest hint of melancholy in his eyes, as he remembered moments from a distant past."
  example_title: "Sadness sentence"
- text: "As she strolled through the park, a soft smile played on her lips, and her heart felt lighter with each step, appreciating the simple beauty of nature."
  example_title: "Joy sentence"
- text: "Their fingers brushed lightly as they exchanged a knowing glance, a subtle connection that spoke volumes about the deep affection they held for each other."
  example_title: "Love sentence"
- text: "She clenched her fists and took a deep breath, trying to suppress the simmering frustration that welled up when her ideas were dismissed without consideration."
  example_title: "Anger sentence"
- text: "In the quiet of the night, the gentle rustling of leaves outside her window sent shivers down her spine, leaving her feeling uneasy and vulnerable."
  example_title: "Fear sentence"
- text: "Upon opening the old dusty book, a delicate, hand-painted map fell out, revealing hidden treasures she never expected to find."
  example_title: "Surprise sentence"
base_model: distilbert-base-uncased
model-index:
- name: distilbert-base-uncased-finetuned-emotion-balanced
  results:
  - task:
      type: text-classification
      name: Text Classification
    dataset:
      name: emotion
      type: emotion
      args: default
    metrics:
    - type: accuracy
      value: 0.9521
      name: Accuracy
    - type: loss
      value: 0.1216
      name: Loss
    - type: f1
      value: 0.9520944952964783
      name: F1
---
# distilbert-emotion

<u><b>Reupload [10/02/23]</b></u>: The model has been retrained with identical hyperparameters on a cleaner version of the dataset, free of certain scraping artifacts. It maintains the same accuracy and loss while generalizing better.

This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on the [emotion balanced dataset](https://huggingface.co/datasets/AdamCodd/emotion-balanced).
It achieves the following results on the evaluation set:
- Loss: 0.1216
- Accuracy: 0.9521

<b>ONNX version</b>: [distilbert-base-uncased-finetuned-emotion-balanced-onnx](https://huggingface.co/AdamCodd/distilbert-base-uncased-finetuned-emotion-balanced-onnx)
## Model description

This emotion classifier was trained on 89,754 examples split into train, validation, and test sets, with all six labels (sadness, joy, love, anger, fear, surprise) perfectly balanced in each split.
## Intended uses & limitations

The model is intended for single-label emotion classification of short English texts into one of six classes: sadness, joy, love, anger, fear, and surprise. It inherits the limitations of distilbert-base-uncased (English only, 512-token input limit) and may be less reliable on text whose emotional tone is implicit, mixed, or expressed differently from the sentences it was trained on.
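
The model can be used with the Transformers `pipeline` API. Below is a minimal inference sketch; the model id is assumed from this repository's name:

```python
from transformers import pipeline

# Model id assumed from this repository; adjust if the weights are hosted elsewhere.
classifier = pipeline(
    "text-classification",
    model="AdamCodd/distilbert-base-uncased-finetuned-emotion-balanced",
)

result = classifier("As she strolled through the park, a soft smile played on her lips.")
print(result)
# Expected output shape: [{'label': 'joy', 'score': ...}] (the exact score will vary)
```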
## Training and evaluation data

The model was trained and evaluated on the [AdamCodd/emotion-balanced](https://huggingface.co/datasets/AdamCodd/emotion-balanced) dataset (89,754 examples in total). The test set used for the results below contains 8,976 examples, 1,496 per emotion class.
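
A short sketch for inspecting the label balance with the Datasets library; the split names (`train`/`validation`/`test`) and the `label` column name are assumptions based on the description above:

```python
from collections import Counter

from datasets import load_dataset

# Split and column names are assumptions; adjust to the dataset's actual schema.
dataset = load_dataset("AdamCodd/emotion-balanced")

for split_name, split in dataset.items():
    # With perfectly balanced labels, every class should have the same count in each split.
    print(split_name, Counter(split["label"]))
```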
## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 32
- eval_batch_size: 64
- seed: 1270
- optimizer: AdamW with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 150
- num_epochs: 3
- weight_decay: 0.01
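
For illustration only, here is roughly how these hyperparameters map onto an AdamW optimizer with a linear warmup schedule in PyTorch. This is a sketch, not the exact training script, and the step count is a placeholder:

```python
import torch
from transformers import AutoModelForSequenceClassification, get_linear_schedule_with_warmup

# The actual run fine-tuned distilbert-base-uncased on six emotion labels.
model = AutoModelForSequenceClassification.from_pretrained("distilbert-base-uncased", num_labels=6)

num_epochs = 3
steps_per_epoch = 2000  # placeholder; in practice this is len(train_dataloader)
total_steps = num_epochs * steps_per_epoch

optimizer = torch.optim.AdamW(
    model.parameters(),
    lr=3e-5,
    betas=(0.9, 0.999),
    eps=1e-8,
    weight_decay=0.01,
)
scheduler = get_linear_schedule_with_warmup(
    optimizer,
    num_warmup_steps=150,
    num_training_steps=total_steps,
)
```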
### Training results

|              | precision | recall | f1-score | support |
|--------------|-----------|--------|----------|---------|
| sadness      | 0.9882    | 0.9485 | 0.9679   | 1496    |
| joy          | 0.9956    | 0.9057 | 0.9485   | 1496    |
| love         | 0.9256    | 0.9980 | 0.9604   | 1496    |
| anger        | 0.9628    | 0.9519 | 0.9573   | 1496    |
| fear         | 0.9348    | 0.9098 | 0.9221   | 1496    |
| surprise     | 0.9160    | 0.9987 | 0.9555   | 1496    |
| accuracy     |           |        | 0.9521   | 8976    |
| macro avg    | 0.9538    | 0.9521 | 0.9520   | 8976    |
| weighted avg | 0.9538    | 0.9521 | 0.9520   | 8976    |

- test_acc: 0.9520944952964783
- test_loss: 0.121663898229599
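
For reference, a sketch of how a report like the one above could be reproduced. The `test` split, the `text`/`label` column names, and the assumption that the model's id2label mapping matches the dataset's label names are unverified here:

```python
from datasets import load_dataset
from sklearn.metrics import classification_report
from transformers import pipeline

# Assumed: a "test" split with "text"/"label" columns, where "label" is a ClassLabel feature.
test_set = load_dataset("AdamCodd/emotion-balanced", split="test")
classifier = pipeline(
    "text-classification",
    model="AdamCodd/distilbert-base-uncased-finetuned-emotion-balanced",
)

predictions = [output["label"] for output in classifier(test_set["text"], batch_size=64)]
references = [test_set.features["label"].int2str(i) for i in test_set["label"]]

print(classification_report(references, predictions, digits=4))
```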
### Framework versions

- Transformers 4.33.1
- PyTorch Lightning 2.0.8
- Tokenizers 0.13.3