---
license: mit
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: MiniLM-evidence-types
  results: []
---

# MiniLM-evidence-types

This model is a fine-tuned version of [microsoft/MiniLM-L12-H384-uncased](https://huggingface.co/microsoft/MiniLM-L12-H384-uncased) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 2.0233
- Macro f1: 0.3675
- Weighted f1: 0.6815
- Accuracy: 0.6948
- Balanced accuracy: 0.3520

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 20
- mixed_precision_training: Native AMP

### Training results

| Training Loss | Epoch | Step | Validation Loss | Macro f1 | Weighted f1 | Accuracy | Balanced accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:-----------:|:--------:|:-----------------:|
| 1.3773        | 1.0   | 125  | 1.2259          | 0.1981   | 0.6131      | 0.6819   | 0.2171            |
| 1.156         | 2.0   | 250  | 1.1316          | 0.2898   | 0.6207      | 0.6636   | 0.3052            |
| 1.0304        | 3.0   | 375  | 1.1232          | 0.2515   | 0.6382      | 0.6461   | 0.2741            |
| 0.8953        | 4.0   | 500  | 1.0837          | 0.2739   | 0.6950      | 0.7131   | 0.2830            |
| 0.7685        | 5.0   | 625  | 1.1225          | 0.3440   | 0.6965      | 0.7207   | 0.3420            |
| 0.6505        | 6.0   | 750  | 1.1907          | 0.3380   | 0.6814      | 0.6963   | 0.3376            |
| 0.5534        | 7.0   | 875  | 1.2381          | 0.3348   | 0.6932      | 0.7139   | 0.3296            |
| 0.4729        | 8.0   | 1000 | 1.3227          | 0.3117   | 0.6929      | 0.7161   | 0.3013            |
| 0.4205        | 9.0   | 1125 | 1.4013          | 0.3374   | 0.6793      | 0.6925   | 0.3298            |
| 0.3618        | 10.0  | 1250 | 1.4847          | 0.3623   | 0.6963      | 0.7131   | 0.3385            |
| 0.3165        | 11.0  | 1375 | 1.5459          | 0.3507   | 0.6732      | 0.6842   | 0.3387            |
| 0.2759        | 12.0  | 1500 | 1.5969          | 0.3556   | 0.6861      | 0.7032   | 0.3406            |
| 0.2474        | 13.0  | 1625 | 1.7362          | 0.3559   | 0.6795      | 0.6880   | 0.3448            |
| 0.2187        | 14.0  | 1750 | 1.8644          | 0.3460   | 0.6786      | 0.6979   | 0.3262            |
| 0.2144        | 15.0  | 1875 | 1.8729          | 0.3478   | 0.6830      | 0.7032   | 0.3289            |
| 0.1911        | 16.0  | 2000 | 1.8958          | 0.3620   | 0.6765      | 0.6834   | 0.3609            |
| 0.1858        | 17.0  | 2125 | 1.9366          | 0.3662   | 0.6815      | 0.6933   | 0.3535            |
| 0.1579        | 18.0  | 2250 | 2.0065          | 0.3624   | 0.6820      | 0.6979   | 0.3442            |
| 0.1492        | 19.0  | 2375 | 2.0467          | 0.3577   | 0.6786      | 0.6963   | 0.3373            |
| 0.1527        | 20.0  | 2500 | 2.0233          | 0.3675   | 0.6815      | 0.6948   | 0.3520            |

### Framework versions

- Transformers 4.19.2
- Pytorch 1.11.0+cu113
- Datasets 2.2.2
- Tokenizers 0.12.1
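
The large gap between the reported macro f1 (0.3675) and weighted f1 (0.6815) indicates heavy class imbalance: the weighted average is dominated by frequent classes, while the macro average counts every class equally. As a minimal illustration (plain Python, toy labels not taken from this model's actual evaluation data), the sketch below computes both averages from per-class F1 scores:

```python
from collections import Counter

def per_class_f1(y_true, y_pred, label):
    """F1 for a single class, treated as a one-vs-rest problem."""
    tp = sum(t == label and p == label for t, p in zip(y_true, y_pred))
    fp = sum(t != label and p == label for t, p in zip(y_true, y_pred))
    fn = sum(t == label and p != label for t, p in zip(y_true, y_pred))
    if tp == 0:
        return 0.0
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    return 2 * precision * recall / (precision + recall)

def macro_weighted_f1(y_true, y_pred):
    labels = sorted(set(y_true))
    support = Counter(y_true)  # number of true examples per class
    f1s = {c: per_class_f1(y_true, y_pred, c) for c in labels}
    # Macro: unweighted mean over classes; weighted: mean weighted by support.
    macro = sum(f1s.values()) / len(labels)
    weighted = sum(f1s[c] * support[c] for c in labels) / len(y_true)
    return macro, weighted

# Imbalanced toy data: 8 examples of class "a", 2 of class "b",
# with one "b" misclassified as "a".
y_true = ["a"] * 8 + ["b"] * 2
y_pred = ["a"] * 8 + ["a", "b"]
macro, weighted = macro_weighted_f1(y_true, y_pred)
# weighted > macro, because the rare class "b" drags the macro average down
```

This mirrors the behavior of `sklearn.metrics.f1_score` with `average="macro"` and `average="weighted"`, and explains why a model can look strong on weighted f1 and accuracy while performing poorly on rare evidence types.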