# ConcSeqBERT-UCIRetail
This model is a fine-tuned version of [google-bert/bert-base-multilingual-cased](https://huggingface.co/google-bert/bert-base-multilingual-cased) on an unknown dataset. It achieves the following results on the evaluation set:
- Loss: 0.4927
- Accuracy: 0.7743
- F1: 0.7692
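
Since the card reports accuracy and F1, the checkpoint appears to be a sequence-classification head on top of BERT. Below is a minimal inference sketch under that assumption; the repository id `ConcSeqBERT-UCIRetail`, the example text, and the label mapping are placeholders, not confirmed by this card.

```python
from transformers import AutoTokenizer, AutoModelForSequenceClassification
import torch

# Assumed repository id; replace with the actual Hub path of this checkpoint.
model_id = "ConcSeqBERT-UCIRetail"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)

# Example input; the task and label set are not documented in this card.
inputs = tokenizer("hand warmer union jack", return_tensors="pt", truncation=True)

with torch.no_grad():
    logits = model(**inputs).logits

predicted_class = logits.argmax(dim=-1).item()
print(model.config.id2label.get(predicted_class, predicted_class))
```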
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (a reproduction sketch follows the list):
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10
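
A minimal sketch of `TrainingArguments` reproducing these settings is below, assuming the standard `Trainer` API from Transformers 4.36. Everything not listed above is left at library defaults, and the commented-out names (`model`, `train_ds`, `eval_ds`, `compute_metrics`) are placeholders not defined in this card.

```python
from transformers import TrainingArguments, Trainer

# Hyperparameters copied from the list above; all other settings are
# library defaults, which is an assumption about the original run.
training_args = TrainingArguments(
    output_dir="ConcSeqBERT-UCIRetail",
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=10,
    evaluation_strategy="epoch",  # matches the per-epoch validation rows below
)

# trainer = Trainer(
#     model=model,               # placeholder: the fine-tuned BERT classifier
#     args=training_args,
#     train_dataset=train_ds,    # placeholder datasets
#     eval_dataset=eval_ds,
#     tokenizer=tokenizer,
#     compute_metrics=compute_metrics,
# )
# trainer.train()
```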
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 |
|---|---|---|---|---|---|
| No log | 1.0 | 456 | 0.5513 | 0.7545 | 0.7512 |
| 0.6706 | 2.0 | 912 | 0.5147 | 0.7685 | 0.7675 |
| 0.5362 | 3.0 | 1368 | 0.4927 | 0.7743 | 0.7692 |
| 0.4686 | 4.0 | 1824 | 0.6096 | 0.7372 | 0.7366 |
| 0.4149 | 5.0 | 2280 | 0.5597 | 0.7702 | 0.7686 |
| 0.3763 | 6.0 | 2736 | 0.8274 | 0.7759 | 0.7759 |
| 0.3642 | 7.0 | 3192 | 0.8380 | 0.7677 | 0.7669 |
| 0.3207 | 8.0 | 3648 | 0.9043 | 0.7661 | 0.7661 |
| 0.2837 | 9.0 | 4104 | 0.9605 | 0.7727 | 0.7722 |
| 0.2508 | 10.0 | 4560 | 1.0251 | 0.7677 | 0.7677 |
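
Validation loss is lowest at epoch 3 (0.4927), which matches the headline metrics at the top of this card; in later epochs training loss keeps falling while validation loss rises, consistent with overfitting. The accuracy and F1 columns can be reproduced with the `evaluate` library; the sketch below is an assumption about the metric setup (in particular the F1 averaging mode, since the card does not state whether the task is binary or multi-class).

```python
import numpy as np
import evaluate

accuracy = evaluate.load("accuracy")
f1 = evaluate.load("f1")

def compute_metrics(eval_pred):
    """Compute accuracy and F1 from Trainer predictions.

    The F1 averaging mode is an assumption; the card does not document
    the label set of the task.
    """
    logits, labels = eval_pred
    predictions = np.argmax(logits, axis=-1)
    return {
        "accuracy": accuracy.compute(predictions=predictions, references=labels)["accuracy"],
        "f1": f1.compute(predictions=predictions, references=labels, average="weighted")["f1"],
    }
```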
### Framework versions
- Transformers 4.36.0.dev0
- Pytorch 2.0.0
- Datasets 2.14.5
- Tokenizers 0.14.1