# legal_bert_sm_cv_summarized_defined_4
This model is a fine-tuned version of nlpaueb/legal-bert-base-uncased on an unspecified dataset. It achieves the following results on the evaluation set:
- Loss: 1.6665
- Accuracy: 0.811
- Precision: 0.5357
- Recall: 0.2308
- F1: 0.3226
- D-index: 1.5269
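As a quick sanity check, the headline F1 above is consistent with the reported precision and recall, since F1 is their harmonic mean:

```python
# Verify that the reported F1 follows from the reported precision and recall.
precision, recall = 0.5357, 0.2308

# F1 is the harmonic mean of precision and recall.
f1 = 2 * precision * recall / (precision + recall)
print(round(f1, 4))  # 0.3226, matching the reported F1
```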
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 8000
- num_epochs: 20
- mixed_precision_training: Native AMP
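One detail worth flagging: with 250 steps per epoch and 20 epochs, training ran for 5000 optimizer steps in total, which is fewer than the 8000 warmup steps. Under the usual linear warmup-then-decay schedule (as in transformers' linear scheduler), the learning rate was therefore still ramping up when training ended and never reached the configured 5e-05 peak. A minimal sketch of that schedule, assuming the standard linear-warmup semantics:

```python
def linear_warmup_lr(step, base_lr=5e-05, warmup_steps=8000, total_steps=5000):
    """Linear warmup to base_lr, then linear decay to 0 (transformers-style)."""
    if step < warmup_steps:
        # Still warming up: LR grows linearly from 0 toward base_lr.
        return base_lr * step / warmup_steps
    # Past warmup: decay linearly to 0 over the remaining steps.
    return base_lr * max(0.0, (total_steps - step) / max(1, total_steps - warmup_steps))

# Training stopped at step 5000 (20 epochs x 250 steps), i.e. mid-warmup,
# so the highest LR ever applied was 5e-05 * 5000/8000 = 3.125e-05.
final_lr = linear_warmup_lr(5000)
print(final_lr)  # 3.125e-05
```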
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Precision | Recall | F1 | D-index |
|---|---|---|---|---|---|---|---|---|
| No log | 1.0 | 250 | 0.4854 | 0.804 | 0.0 | 0.0 | 0.0 | 1.4356 |
| 0.5596 | 2.0 | 500 | 0.4545 | 0.805 | 0.0 | 0.0 | 0.0 | 1.4370 |
| 0.5596 | 3.0 | 750 | 0.4570 | 0.811 | 0.6667 | 0.0615 | 0.1127 | 1.4675 |
| 0.4293 | 4.0 | 1000 | 0.4673 | 0.815 | 0.6316 | 0.1231 | 0.2060 | 1.4949 |
| 0.4293 | 5.0 | 1250 | 0.4893 | 0.828 | 0.6949 | 0.2103 | 0.3228 | 1.5429 |
| 0.3311 | 6.0 | 1500 | 0.5062 | 0.828 | 0.6533 | 0.2513 | 0.3630 | 1.5569 |
| 0.3311 | 7.0 | 1750 | 0.5584 | 0.826 | 0.7059 | 0.1846 | 0.2927 | 1.5313 |
| 0.2126 | 8.0 | 2000 | 0.7423 | 0.821 | 0.6333 | 0.1949 | 0.2980 | 1.5281 |
| 0.2126 | 9.0 | 2250 | 0.8720 | 0.804 | 0.4933 | 0.1897 | 0.2741 | 1.5031 |
| 0.1327 | 10.0 | 2500 | 0.9116 | 0.811 | 0.5268 | 0.3026 | 0.3844 | 1.5513 |
| 0.1327 | 11.0 | 2750 | 0.9677 | 0.8 | 0.4809 | 0.3231 | 0.3865 | 1.5434 |
| 0.0803 | 12.0 | 3000 | 1.1951 | 0.795 | 0.4627 | 0.3179 | 0.3769 | 1.5349 |
| 0.0803 | 13.0 | 3250 | 1.3724 | 0.819 | 0.5946 | 0.2256 | 0.3271 | 1.5360 |
| 0.0584 | 14.0 | 3500 | 1.4260 | 0.806 | 0.5056 | 0.2308 | 0.3169 | 1.5201 |
| 0.0584 | 15.0 | 3750 | 1.4684 | 0.812 | 0.5327 | 0.2923 | 0.3775 | 1.5492 |
| 0.0437 | 16.0 | 4000 | 1.5562 | 0.815 | 0.5658 | 0.2205 | 0.3173 | 1.5288 |
| 0.0437 | 17.0 | 4250 | 1.5812 | 0.814 | 0.5763 | 0.1744 | 0.2677 | 1.5115 |
| 0.0357 | 18.0 | 4500 | 1.6058 | 0.805 | 0.5 | 0.2308 | 0.3158 | 1.5187 |
| 0.0357 | 19.0 | 4750 | 1.6784 | 0.813 | 0.5465 | 0.2410 | 0.3345 | 1.5331 |
| 0.0334 | 20.0 | 5000 | 1.6665 | 0.811 | 0.5357 | 0.2308 | 0.3226 | 1.5269 |
### Framework versions
- Transformers 4.28.0
- Pytorch 2.0.1+cu118
- Datasets 2.12.0
- Tokenizers 0.13.3