Distilbert-Base-Uncased-Go-Emotion
Model description:
DistilBERT (base, uncased) fine-tuned for emotion classification on the GoEmotions dataset. Note: this model is not yet working reliably; results below should be treated as preliminary.
Training Parameters:
- Num Epochs = 3
- Instantaneous batch size per device = 32
- Total train batch size (w. parallel, distributed & accumulation) = 32
- Gradient Accumulation steps = 1
- Total optimization steps = 15831
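The hyperparameters above could be expressed with a `transformers` `TrainingArguments` along these lines. This is a sketch only: `output_dir` is a placeholder, and anything not listed in the card (learning rate, warmup, etc.) is left at library defaults rather than assumed.

```python
from transformers import TrainingArguments

# Sketch reconstructed from the card's stated hyperparameters.
# output_dir is a placeholder; unlisted settings use library defaults.
training_args = TrainingArguments(
    output_dir="./results",          # placeholder, not stated in the card
    num_train_epochs=3,              # Num Epochs = 3
    per_device_train_batch_size=32,  # Instantaneous batch size per device = 32
    gradient_accumulation_steps=1,   # Gradient Accumulation steps = 1
)
```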
TrainOutput:
- 'train_loss': 0.105500
Evaluation Output:
- 'eval_accuracy_thresh': 0.962023913860321
- 'eval_loss': 0.11090277135372162
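The `eval_accuracy_thresh` metric is the standard element-wise accuracy for multi-label classification: each logit is passed through a sigmoid, thresholded (typically at 0.5), and compared against the 0/1 target for that label slot. A minimal sketch, assuming the usual 0.5 threshold (the card does not state it):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def accuracy_thresh(logits, labels, thresh=0.5):
    """Fraction of (sample, label) slots where the thresholded
    sigmoid prediction matches the 0/1 target."""
    preds = sigmoid(np.asarray(logits, dtype=float)) > thresh
    return float((preds == np.asarray(labels).astype(bool)).mean())

# Toy example: 2 samples x 3 emotion labels
logits = [[2.0, -3.0, 0.1], [-1.5, 4.0, -2.0]]
labels = [[1, 0, 1], [0, 1, 0]]
print(accuracy_thresh(logits, labels))  # → 1.0 (all 6 slots correct)
```

Because the metric averages over every label slot rather than whole samples, a high value (here ≈0.962) can coexist with mediocre per-example predictions when most labels are 0.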
Colab Notebook: