Distilbert-Base-Uncased-Go-Emotion
Model description:
Note: this model is not working well at the moment; treat its predictions with caution.
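Even so, the checkpoint can be loaded for experimentation. Below is a minimal inference sketch using the standard `transformers` pipeline API; the model id is a placeholder to replace with this checkpoint's actual Hub id, and `top_k=None` is used because GoEmotions is a multi-label dataset, so all label scores are usually of interest:

```python
from transformers import pipeline

# Placeholder: replace with the actual Hub id of this checkpoint.
MODEL_ID = "distilbert-base-uncased-go-emotion"

# GoEmotions is multi-label, so return scores for every label
# instead of only the top prediction.
classifier = pipeline("text-classification", model=MODEL_ID, top_k=None)

print(classifier("I am so happy about this!"))
```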
Training Parameters:
Num Epochs = 3
Instantaneous batch size per device = 32
Total train batch size (w. parallel, distributed & accumulation) = 32
Gradient Accumulation steps = 1
Total optimization steps = 15831
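For reference, a sketch of how the logged values above map onto `transformers.TrainingArguments`; the output directory and any setting not listed above (learning rate, weight decay, etc.) are assumptions, not values taken from this card:

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="./distilbert-go-emotion",  # hypothetical path
    num_train_epochs=3,                    # Num Epochs = 3
    per_device_train_batch_size=32,        # Instantaneous batch size per device = 32
    gradient_accumulation_steps=1,         # Gradient Accumulation steps = 1
)
```

The total of 15831 optimization steps then follows from the training-set size divided by the effective batch size of 32, times 3 epochs.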
TrainOutput:
'train_loss': 0.105500
Evaluation Output:
'eval_accuracy_thresh': 0.962023913860321,
'eval_loss': 0.11090277135372162,
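The metric name `eval_accuracy_thresh` suggests a thresholded multi-label accuracy: predictions are passed through a sigmoid, cut at a threshold, and compared element-wise against the gold labels. A sketch of that computation is below; the 0.5 threshold is an assumption, as the card does not state it:

```python
import torch

def accuracy_thresh(logits: torch.Tensor, labels: torch.Tensor,
                    thresh: float = 0.5) -> float:
    """Fraction of (example, label) pairs where the sigmoid-ed
    prediction, cut at `thresh`, matches the gold label.
    The 0.5 threshold is an assumption, not stated in the card."""
    probs = torch.sigmoid(logits)
    return ((probs > thresh) == labels.bool()).float().mean().item()
```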
Colab Notebook: