---
language: en
tags:
- emotion-classification
datasets:
- go-emotions
- bdotloh/empathetic-dialogues-contexts
---
# Model Description
Yet another Transformer model fine-tuned for approximating another non-linear mapping between X and Y? That's right!
This is your good ol' emotion classifier: given an input text, the model outputs a probability distribution over a set of pre-selected emotion words. In this case, that set consists of the 32 emotion classes in the [Empathetic Dialogues](https://huggingface.co/datasets/bdotloh/empathetic-dialogues-contexts) dataset.
This model is built "on top of" a [distilbert-base-uncased model fine-tuned on the go-emotions dataset](https://huggingface.co/bhadresh-savani/bert-base-go-emotion). Y'all should really check out that model; it even contains a Jupyter notebook that illustrates how the model was trained (bhadresh-savani, if you see this, thank you!).
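For a quick spin, here's a minimal usage sketch with the 🤗 Transformers `pipeline` API. The repo id below is a placeholder (not confirmed by this card), so swap in this model's actual Hugging Face id; `top_k=None` simply asks the pipeline to return scores for all 32 classes instead of just the top one.

```python
# Minimal usage sketch. Assumption: the repo id below is a placeholder --
# replace it with this model's actual Hugging Face id before running.
from transformers import pipeline

classifier = pipeline(
    "text-classification",
    model="bdotloh/<this-model>",  # placeholder repo id
    top_k=None,                    # return scores for all 32 emotion classes
)

scores = classifier("I finally got the job I'd been chasing for months!")

# `scores` is a list (one entry per input text) of lists of {label, score}
# dicts sorted by score, e.g. [{'label': 'excited', 'score': 0.41}, ...]
print(scores[0][:5])
```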
## Training data
## Training procedure
### Preprocessing
## Evaluation results
### Limitations and bias
Well, where should we begin...
EmpatheticDialogues:
1) We are unable to ascertain the degree of cultural specificity of the contexts that respondents described when given an emotion label (i.e., p(description | emotion, *culture*))
2) ...