---
license: apache-2.0
base_model: distilroberta-base
tags:
- generated_from_trainer
model-index:
- name: modelo_entrenado_02
  results: []
---

# modelo_entrenado_02

This model is a fine-tuned version of [distilroberta-base](https://huggingface.co/distilroberta-base) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 2.9377

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 100

### Training results

| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:----:|:---------------:|
| No log        | 1.0   | 2    | 2.4285          |
| No log        | 2.0   | 4    | 2.5700          |
| No log        | 3.0   | 6    | 2.3661          |
| No log        | 4.0   | 8    | 2.7635          |
| No log        | 5.0   | 10   | 2.6494          |
| No log        | 6.0   | 12   | 2.5599          |
| No log        | 7.0   | 14   | 2.8686          |
| No log        | 8.0   | 16   | 2.9705          |
| No log        | 9.0   | 18   | 2.6723          |
| No log        | 10.0  | 20   | 2.5567          |
| No log        | 11.0  | 22   | 2.4658          |
| No log        | 12.0  | 24   | 2.5316          |
| No log        | 13.0  | 26   | 2.6476          |
| No log        | 14.0  | 28   | 2.4711          |
| No log        | 15.0  | 30   | 2.7310          |
| No log        | 16.0  | 32   | 2.8316          |
| No log        | 17.0  | 34   | 2.7867          |
| No log        | 18.0  | 36   | 2.3467          |
| No log        | 19.0  | 38   | 3.1519          |
| No log        | 20.0  | 40   | 2.9244          |
| No log        | 21.0  | 42   | 2.7193          |
| No log        | 22.0  | 44   | 2.8540          |
| No log        | 23.0  | 46   | 2.7119          |
| No log        | 24.0  | 48   | 2.9885          |
| No log        | 25.0  | 50   | 2.7189          |
| No log        | 26.0  | 52   | 2.7430          |
| No log        | 27.0  | 54   | 2.8829          |
| No log        | 28.0  | 56   | 2.7182          |
| No log        | 29.0  | 58   | 2.9149          |
| No log        | 30.0  | 60   | 2.7041          |
| No log        | 31.0  | 62   | 2.8468          |
| No log        | 32.0  | 64   | 2.9701          |
| No log        | 33.0  | 66   | 2.5975          |
| No log        | 34.0  | 68   | 2.9569          |
| No log        | 35.0  | 70   | 2.8847          |
| No log        | 36.0  | 72   | 3.2673          |
| No log        | 37.0  | 74   | 3.0552          |
| No log        | 38.0  | 76   | 2.8077          |
| No log        | 39.0  | 78   | 2.9599          |
| No log        | 40.0  | 80   | 2.9013          |
| No log        | 41.0  | 82   | 3.0198          |
| No log        | 42.0  | 84   | 2.6469          |
| No log        | 43.0  | 86   | 3.1038          |
| No log        | 44.0  | 88   | 2.8045          |
| No log        | 45.0  | 90   | 2.8941          |
| No log        | 46.0  | 92   | 2.6583          |
| No log        | 47.0  | 94   | 2.5010          |
| No log        | 48.0  | 96   | 2.8845          |
| No log        | 49.0  | 98   | 2.8941          |
| No log        | 50.0  | 100  | 2.9485          |
| No log        | 51.0  | 102  | 2.6861          |
| No log        | 52.0  | 104  | 2.8158          |
| No log        | 53.0  | 106  | 2.7242          |
| No log        | 54.0  | 108  | 2.8971          |
| No log        | 55.0  | 110  | 2.7668          |
| No log        | 56.0  | 112  | 2.8346          |
| No log        | 57.0  | 114  | 2.8979          |
| No log        | 58.0  | 116  | 2.6177          |
| No log        | 59.0  | 118  | 2.7576          |
| No log        | 60.0  | 120  | 2.6442          |
| No log        | 61.0  | 122  | 2.9222          |
| No log        | 62.0  | 124  | 2.6028          |
| No log        | 63.0  | 126  | 3.3076          |
| No log        | 64.0  | 128  | 2.6238          |
| No log        | 65.0  | 130  | 2.8379          |
| No log        | 66.0  | 132  | 2.7671          |
| No log        | 67.0  | 134  | 2.9184          |
| No log        | 68.0  | 136  | 2.8382          |
| No log        | 69.0  | 138  | 2.7783          |
| No log        | 70.0  | 140  | 2.9365          |
| No log        | 71.0  | 142  | 2.9453          |
| No log        | 72.0  | 144  | 2.6354          |
| No log        | 73.0  | 146  | 2.6368          |
| No log        | 74.0  | 148  | 2.9489          |
| No log        | 75.0  | 150  | 2.6480          |
| No log        | 76.0  | 152  | 3.0972          |
| No log        | 77.0  | 154  | 2.9844          |
| No log        | 78.0  | 156  | 2.7689          |
| No log        | 79.0  | 158  | 2.8069          |
| No log        | 80.0  | 160  | 2.7575          |
| No log        | 81.0  | 162  | 2.6872          |
| No log        | 82.0  | 164  | 3.0615          |
| No log        | 83.0  | 166  | 3.0133          |
| No log        | 84.0  | 168  | 2.8961          |
| No log        | 85.0  | 170  | 2.9905          |
| No log        | 86.0  | 172  | 2.9221          |
| No log        | 87.0  | 174  | 2.9477          |
| No log        | 88.0  | 176  | 3.1695          |
| No log        | 89.0  | 178  | 2.9689          |
| No log        | 90.0  | 180  | 2.8121          |
| No log        | 91.0  | 182  | 2.9417          |
| No log        | 92.0  | 184  | 3.2368          |
| No log        | 93.0  | 186  | 3.1631          |
| No log        | 94.0  | 188  | 2.8032          |
| No log        | 95.0  | 190  | 2.8622          |
| No log        | 96.0  | 192  | 2.9860          |
| No log        | 97.0  | 194  | 2.9899          |
| No log        | 98.0  | 196  | 3.0396          |
| No log        | 99.0  | 198  | 2.8936          |
| No log        | 100.0 | 200  | 2.9383          |
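The original training script is not distributed with this card, so the following is only a minimal sketch of how the hyperparameters above map onto `TrainingArguments`. It assumes the checkpoint was fine-tuned with distilroberta-base's default masked-language-modeling objective; the toy dataset, tokenization settings, and `mlm_probability` are assumptions, not values documented on this card.

```python
# Hedged reconstruction sketch: only the values marked "matches the card"
# come from this model card; everything else is an assumption.
from datasets import Dataset
from transformers import (
    AutoModelForMaskedLM,
    AutoTokenizer,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)

tokenizer = AutoTokenizer.from_pretrained("distilroberta-base")
model = AutoModelForMaskedLM.from_pretrained("distilroberta-base")

# Stand-in corpus: the actual training and evaluation data are undocumented.
raw = Dataset.from_dict({"text": ["A toy training sentence.", "Another toy sentence."]})
dataset = raw.map(
    lambda batch: tokenizer(batch["text"], truncation=True, max_length=128),
    batched=True,
    remove_columns=["text"],
)

args = TrainingArguments(
    output_dir="modelo_entrenado_02",
    learning_rate=2e-5,             # matches the card
    per_device_train_batch_size=8,  # matches the card
    per_device_eval_batch_size=8,   # matches the card
    seed=42,                        # matches the card
    lr_scheduler_type="linear",     # matches the card
    num_train_epochs=100,           # matches the card
    evaluation_strategy="epoch",    # inferred from the per-epoch rows above
)

# The card's Adam betas=(0.9,0.999) and epsilon=1e-08 are the
# TrainingArguments defaults, so they need no explicit settings.
collator = DataCollatorForLanguageModeling(
    tokenizer=tokenizer,
    mlm_probability=0.15,  # library default, assumed rather than documented
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=dataset,  # stand-in; the real split is undocumented
    eval_dataset=dataset,   # stand-in; the real split is undocumented
    data_collator=collator,
)
trainer.train()
```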
### Framework versions

- Transformers 4.41.0
- Pytorch 2.3.0+cu121
- Datasets 2.19.1
- Tokenizers 0.19.1
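Since the intended-uses section is empty, the following is likewise only a hedged usage sketch with the framework versions above: it assumes the saved checkpoint kept the masked-LM head, and `./modelo_entrenado_02` is a placeholder for wherever the checkpoint actually lives, not a published repo id.

```python
# Hedged usage sketch: assumes an MLM checkpoint saved locally under
# "./modelo_entrenado_02" (a placeholder path).
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="./modelo_entrenado_02")

# RoBERTa-family tokenizers use <mask> as the mask token.
for prediction in fill_mask("The capital of France is <mask>."):
    print(f"{prediction['token_str']!r}: {prediction['score']:.3f}")
```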