---
license: mit
language:
- en
metrics:
- accuracy
- f1
- precision
- recall
library_name: transformers
pipeline_tag: text-classification
---
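# Usage
The metadata above declares `library_name: transformers` and `pipeline_tag: text-classification`, so the model can be loaded through the standard `pipeline` API. The repository ID below is a placeholder, since the card does not state the model's Hub identifier; this is a minimal sketch, not the card's own usage code.
```python
from transformers import pipeline

# Placeholder repo ID -- replace with this model's actual Hugging Face Hub ID.
classifier = pipeline("text-classification", model="<username>/<model-name>")

print(classifier("An example sentence to classify."))
# e.g. [{'label': 'LABEL_1', 'score': 0.97}]
```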
# Total Samples
- Samples: 716,017 (train + test)
- Training samples: 579,973
- Validation samples: 64,442
- Test samples: 71,602
# Overall Metrics
- Accuracy: 92%
- F1 score: 92%
- Recall: 92%
- Precision: 92%
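These values (and the `eval_*` numbers in the Validation section below) are the kind of output produced by a `compute_metrics` callback passed to the transformers `Trainer`. The card does not include the actual metric code; the sketch below is one common way to compute these four metrics with scikit-learn, and the weighted averaging is an assumption.
```python
import numpy as np
from sklearn.metrics import accuracy_score, precision_recall_fscore_support

def compute_metrics(eval_pred):
    # eval_pred is a (logits, labels) tuple provided by the Trainer.
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)
    # Weighted averaging is assumed; the card does not say which averaging was used.
    precision, recall, f1, _ = precision_recall_fscore_support(
        labels, preds, average="weighted"
    )
    return {
        "accuracy": accuracy_score(labels, preds),
        "precision": precision,
        "recall": recall,
        "f1": f1,
    }
```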
# Fine-Tuning Parameters
- Number of epochs: 3
- Batch size: 16
- Evaluation strategy: epoch
- Optimizer: AdamW
- Learning rate: 2e-5
- Max steps: 1000
- Warmup steps: 100
- Training & evaluation monitoring: Weights & Biases (wandb)
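The hyperparameters above map onto `TrainingArguments` roughly as sketched below. The base checkpoint, dataset variables, and output directory are placeholders not taken from the card, and `compute_metrics` refers to the callback sketched in the Overall Metrics section.
```python
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

model_name = "bert-base-uncased"  # placeholder: base checkpoint not stated in the card
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(model_name, num_labels=2)

training_args = TrainingArguments(
    output_dir="./results",          # placeholder output directory
    num_train_epochs=3,
    per_device_train_batch_size=16,
    learning_rate=2e-5,
    max_steps=1000,                  # when set, this caps training regardless of epochs
    warmup_steps=100,
    evaluation_strategy="epoch",     # renamed to eval_strategy in newer transformers
    optim="adamw_torch",             # AdamW optimizer
    report_to="wandb",               # train/eval monitoring via Weights & Biases
)

trainer = Trainer(
    model=model,
    args=training_args,
    train_dataset=train_dataset,     # placeholder: tokenized training split
    eval_dataset=eval_dataset,       # placeholder: tokenized validation split
    compute_metrics=compute_metrics, # metric callback (see Overall Metrics)
)
trainer.train()
```
Note that the `max_steps=1000` cap ends training before the three configured epochs complete, which is consistent with the `epoch: 0.22` value in the logs below.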
# Train
- train_runtime: 1594.4072 s
- train_samples_per_second: 80.281
- train_steps_per_second: 0.627
- total_flos: 5589761482241280.0
- train_loss: 0.26639655661582945
- epoch: 0.22
# Validation
- eval_loss: 0.22991116344928741
- eval_accuracy: 0.9211073523478477
- eval_precision: 0.9213582014463746
- eval_recall: 0.921107352347847
- eval_f1: 0.9210970707304227
- eval_runtime: 238.5409 s
- eval_samples_per_second: 270.151
- eval_steps_per_second: 8.443
- epoch: 0.22