---
license: apache-2.0
base_model: distilbert/distilroberta-base
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: distilroberta_base_ledgar
  results: []
---

# distilroberta_base_ledgar

This model is a fine-tuned version of [distilbert/distilroberta-base](https://huggingface.co/distilbert/distilroberta-base) on a dataset not recorded in the training logs (most likely LEDGAR; see *Training and evaluation data* below).
It achieves the following results on the evaluation set:
- Loss: 0.6728
- Accuracy: 0.8262
- F1 Macro: 0.6967
- F1 Micro: 0.8262

## Model description

A single-label text classifier obtained by adding a sequence-classification head to [distilbert/distilroberta-base](https://huggingface.co/distilbert/distilroberta-base), a six-layer distilled version of RoBERTa-base, and fine-tuning end to end.

## Intended uses & limitations

Assuming the LEDGAR task inferred below, the model classifies English contract provisions into one of 100 provision types; a usage sketch appears under *How to use* at the end of this card. Note the gap between micro F1 (0.8262) and macro F1 (0.6967): performance on infrequent provision types is markedly weaker than on common ones.

## Training and evaluation data

The training logs do not record the dataset, but the model name points to LEDGAR, the contract-provision classification task of [LexGLUE](https://huggingface.co/datasets/lex_glue). The step counts are consistent with that: roughly 937 optimizer steps per epoch at a total batch size of 64 implies about 60,000 training examples, matching LEDGAR's 60k-provision training split.

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- distributed_type: multi-GPU
- num_devices: 2
- total_train_batch_size: 64
- total_eval_batch_size: 64
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 3.0

A `Trainer` sketch that mirrors these settings appears under *Reproducing the fine-tuning* at the end of this card.

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 Macro | F1 Micro |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:--------:|:--------:|
| 3.3872        | 0.11  | 100  | 3.0589          | 0.45     | 0.1951   | 0.45     |
| 2.4052        | 0.21  | 200  | 2.1783          | 0.6026   | 0.3248   | 0.6026   |
| 1.9345        | 0.32  | 300  | 1.7352          | 0.6639   | 0.4073   | 0.6639   |
| 1.605         | 0.43  | 400  | 1.4555          | 0.7138   | 0.4817   | 0.7138   |
| 1.4359        | 0.53  | 500  | 1.2634          | 0.7429   | 0.5382   | 0.7429   |
| 1.3107        | 0.64  | 600  | 1.1391          | 0.7678   | 0.5849   | 0.7678   |
| 1.1656        | 0.75  | 700  | 1.0473          | 0.7775   | 0.6012   | 0.7775   |
| 1.1157        | 0.85  | 800  | 0.9757          | 0.7801   | 0.6054   | 0.7801   |
| 1.0035        | 0.96  | 900  | 0.9160          | 0.7934   | 0.6256   | 0.7934   |
| 0.9232        | 1.07  | 1000 | 0.8697          | 0.8008   | 0.6364   | 0.8008   |
| 0.9007        | 1.17  | 1100 | 0.8374          | 0.8057   | 0.6479   | 0.8057   |
| 0.9422        | 1.28  | 1200 | 0.8185          | 0.8078   | 0.6542   | 0.8078   |
| 0.8607        | 1.39  | 1300 | 0.7933          | 0.8093   | 0.6593   | 0.8093   |
| 0.7426        | 1.49  | 1400 | 0.7753          | 0.8098   | 0.6654   | 0.8098   |
| 0.7741        | 1.6   | 1500 | 0.7569          | 0.8122   | 0.6666   | 0.8122   |
| 0.8094        | 1.71  | 1600 | 0.7388          | 0.8184   | 0.6773   | 0.8184   |
| 0.7809        | 1.81  | 1700 | 0.7321          | 0.8172   | 0.6789   | 0.8172   |
| 0.7435        | 1.92  | 1800 | 0.7198          | 0.8182   | 0.6775   | 0.8182   |
| 0.718         | 2.03  | 1900 | 0.7103          | 0.8201   | 0.6810   | 0.8201   |
| 0.6816        | 2.13  | 2000 | 0.7006          | 0.8208   | 0.6828   | 0.8208   |
| 0.7262        | 2.24  | 2100 | 0.6982          | 0.8233   | 0.6907   | 0.8233   |
| 0.683         | 2.35  | 2200 | 0.6932          | 0.8244   | 0.6917   | 0.8244   |
| 0.6892        | 2.45  | 2300 | 0.6871          | 0.8238   | 0.6902   | 0.8238   |
| 0.6712        | 2.56  | 2400 | 0.6783          | 0.8271   | 0.6975   | 0.8271   |
| 0.6442        | 2.67  | 2500 | 0.6761          | 0.8263   | 0.6938   | 0.8263   |
| 0.6847        | 2.77  | 2600 | 0.6751          | 0.8258   | 0.6954   | 0.8258   |
| 0.6466        | 2.88  | 2700 | 0.6746          | 0.8264   | 0.6960   | 0.8264   |
| 0.6402        | 2.99  | 2800 | 0.6728          | 0.8262   | 0.6967   | 0.8262   |

### Framework versions

- Transformers 4.39.0.dev0
- Pytorch 2.2.1+cu121
- Datasets 2.18.0
- Tokenizers 0.15.2
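
## How to use

A minimal inference sketch. The repository id `your-username/distilroberta_base_ledgar` is a placeholder (this card does not say where the checkpoint is hosted), and the printed label is illustrative only:

```python
from transformers import pipeline

# Placeholder repo id -- substitute the actual location of this checkpoint.
classifier = pipeline(
    "text-classification",
    model="your-username/distilroberta_base_ledgar",
)

provision = (
    "This Agreement shall be governed by and construed in accordance with "
    "the laws of the State of New York."
)
print(classifier(provision))
# Illustrative output; the actual label names depend on the saved config:
# [{'label': 'Governing Laws', 'score': 0.98}]
```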
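If the checkpoint was saved without human-readable label names, the pipeline will emit generic ids such as `LABEL_57`. Assuming the LEDGAR task, the names can be grafted on from the dataset's label feature:

```python
from datasets import load_dataset
from transformers import AutoModelForSequenceClassification, AutoTokenizer, pipeline

# Assumes the ledgar task of LexGLUE; the repo id is again a placeholder.
names = load_dataset("lex_glue", "ledgar", split="test").features["label"].names
model = AutoModelForSequenceClassification.from_pretrained(
    "your-username/distilroberta_base_ledgar"
)
model.config.id2label = dict(enumerate(names))
model.config.label2id = {name: i for i, name in enumerate(names)}

tokenizer = AutoTokenizer.from_pretrained("your-username/distilroberta_base_ledgar")
classifier = pipeline("text-classification", model=model, tokenizer=tokenizer)
```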
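## Reproducing the fine-tuning

A sketch of a training run that mirrors the hyperparameters above. The dataset choice is the LEDGAR inference from earlier, not something this card confirms. A per-device batch size of 32 on the two GPUs noted above gives the total train batch size of 64; launching with `torchrun --nproc_per_node=2` matches the distributed setup:

```python
import numpy as np
from datasets import load_dataset
from sklearn.metrics import accuracy_score, f1_score
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

# Assumption: the LEDGAR task of LexGLUE (100 provision-type labels).
dataset = load_dataset("lex_glue", "ledgar")
num_labels = dataset["train"].features["label"].num_classes

tokenizer = AutoTokenizer.from_pretrained("distilbert/distilroberta-base")
model = AutoModelForSequenceClassification.from_pretrained(
    "distilbert/distilroberta-base", num_labels=num_labels
)

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True)

tokenized = dataset.map(tokenize, batched=True)

def compute_metrics(eval_pred):
    # Accuracy plus macro/micro F1, matching the metrics reported above.
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)
    return {
        "accuracy": accuracy_score(labels, preds),
        "f1_macro": f1_score(labels, preds, average="macro"),
        "f1_micro": f1_score(labels, preds, average="micro"),
    }

# Mirrors the hyperparameter list; eval/logging every 100 steps reproduces
# the cadence of the training-results table.
args = TrainingArguments(
    output_dir="distilroberta_base_ledgar",
    learning_rate=2e-05,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    num_train_epochs=3.0,
    lr_scheduler_type="linear",
    seed=42,
    evaluation_strategy="steps",
    eval_steps=100,
    logging_steps=100,
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=tokenized["train"],
    eval_dataset=tokenized["validation"],
    tokenizer=tokenizer,  # enables dynamic padding via the default collator
    compute_metrics=compute_metrics,
)
trainer.train()
```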