---
license: apache-2.0
base_model: distilbert/distilroberta-base
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: distilroberta_base_patent
  results: []
---

# distilroberta_base_patent

This model is a fine-tuned version of [distilbert/distilroberta-base](https://huggingface.co/distilbert/distilroberta-base) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 1.0022
- Accuracy: 0.6596
- F1 Macro: 0.5725
- F1 Micro: 0.6596

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- distributed_type: multi-GPU
- num_devices: 2
- total_train_batch_size: 64
- total_eval_batch_size: 64
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 3.0

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 Macro | F1 Micro |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:--------:|:--------:|
| 1.5474        | 0.13  | 50   | 1.4682          | 0.4644   | 0.3007   | 0.4644   |
| 1.2975        | 0.26  | 100  | 1.2702          | 0.5514   | 0.3857   | 0.5514   |
| 1.277         | 0.38  | 150  | 1.1989          | 0.588    | 0.4213   | 0.588    |
| 1.1483        | 0.51  | 200  | 1.1509          | 0.6018   | 0.4433   | 0.6018   |
| 1.1909        | 0.64  | 250  | 1.1209          | 0.618    | 0.4785   | 0.618    |
| 1.1243        | 0.77  | 300  | 1.1128          | 0.622    | 0.4930   | 0.622    |
| 1.1353        | 0.9   | 350  | 1.1134          | 0.609    | 0.4930   | 0.609    |
| 1.0636        | 1.02  | 400  | 1.0676          | 0.64     | 0.5189   | 0.64     |
| 0.9667        | 1.15  | 450  | 1.0703          | 0.6404   | 0.5193   | 0.6404   |
| 1.0063        | 1.28  | 500  | 1.0495          | 0.6386   | 0.5128   | 0.6386   |
| 0.9521        | 1.41  | 550  | 1.0469          | 0.6432   | 0.5185   | 0.6432   |
| 0.998         | 1.53  | 600  | 1.0359          | 0.6486   | 0.5357   | 0.6486   |
| 1.0188        | 1.66  | 650  | 1.0530          | 0.6418   | 0.5395   | 0.6418   |
| 0.9617        | 1.79  | 700  | 1.0214          | 0.6526   | 0.5307   | 0.6526   |
| 1.0234        | 1.92  | 750  | 1.0148          | 0.6514   | 0.5495   | 0.6514   |
| 0.8914        | 2.05  | 800  | 1.0132          | 0.6544   | 0.5603   | 0.6544   |
| 0.9269        | 2.17  | 850  | 1.0110          | 0.6562   | 0.5647   | 0.6562   |
| 1.0351        | 2.3   | 900  | 1.0124          | 0.6528   | 0.5717   | 0.6528   |
| 0.9582        | 2.43  | 950  | 1.0150          | 0.6524   | 0.5552   | 0.6524   |
| 0.8959        | 2.56  | 1000 | 1.0069          | 0.659    | 0.5741   | 0.659    |
| 0.8342        | 2.69  | 1050 | 1.0031          | 0.6596   | 0.5794   | 0.6596   |
| 0.883         | 2.81  | 1100 | 1.0042          | 0.6594   | 0.5767   | 0.6594   |
| 0.9377        | 2.94  | 1150 | 1.0022          | 0.6596   | 0.5725   | 0.6596   |

### Framework versions

- Transformers 4.39.0.dev0
- Pytorch 2.2.1+cu121
- Datasets 2.18.0
- Tokenizers 0.15.2
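
## How to use

The snippet below is a minimal inference sketch, not part of the original training setup. It assumes the checkpoint is published as a standard sequence-classification model; `your-username/distilroberta_base_patent` is a placeholder repo id, and the label names depend on the fine-tuning dataset, which is not documented in this card.

```python
# Minimal inference sketch. Assumptions: the checkpoint is a standard
# sequence-classification model, and the repo id below is a placeholder.
from transformers import pipeline

classifier = pipeline(
    "text-classification",
    model="your-username/distilroberta_base_patent",  # placeholder repo id
)

text = "A method for routing data packets in a wireless sensor network."
print(classifier(text))
# e.g. [{'label': 'LABEL_2', 'score': 0.71}]; label names and count depend
# on the undocumented fine-tuning dataset.
```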
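
### Reproducing the hyperparameters

The hyperparameters listed above map directly onto `transformers.TrainingArguments`. The sketch below is a reconstruction under stated assumptions, not the original training script: dataset loading, the classification head, and the `Trainer` wiring are omitted because the training data is not documented, `output_dir` is a placeholder, and the `steps` evaluation strategy with `eval_steps=50` is inferred from the 50-step cadence of the results table.

```python
# Sketch of TrainingArguments mirroring the listed hyperparameters.
# This is a reconstruction, not the original script.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="distilroberta_base_patent",  # placeholder
    learning_rate=2e-5,
    per_device_train_batch_size=32,  # 2 GPUs -> total train batch size 64
    per_device_eval_batch_size=32,   # 2 GPUs -> total eval batch size 64
    seed=42,
    num_train_epochs=3.0,
    lr_scheduler_type="linear",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    evaluation_strategy="steps",  # inferred: metrics were logged by step
    eval_steps=50,                # inferred: the table evaluates every 50 steps
)
```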
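
### A note on the metrics

For single-label classification, micro-averaged F1 is identical to accuracy, which is why the Accuracy and F1 Micro columns above match row for row; macro F1 is lower because it weights every class equally regardless of class frequency. A small illustration with scikit-learn, using toy labels rather than data from this model:

```python
from sklearn.metrics import accuracy_score, f1_score

# Toy single-label example, for illustration only.
y_true = [0, 1, 2, 2, 1]
y_pred = [0, 1, 2, 1, 1]

print(accuracy_score(y_true, y_pred))             # 0.8
print(f1_score(y_true, y_pred, average="micro"))  # 0.8, equal to accuracy
print(f1_score(y_true, y_pred, average="macro"))  # ~0.822, unweighted mean of per-class F1
```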