---
license: apache-2.0
base_model: distilroberta-base
tags:
- generated_from_trainer
model-index:
- name: modelo_entrenado_02
  results: []
---

# modelo_entrenado_02

This model is a fine-tuned version of [distilroberta-base](https://huggingface.co/distilroberta-base) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 3.1244

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a sketch of the matching `TrainingArguments` follows the list):
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 100
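For reference, a minimal sketch of how the settings above map onto `transformers.TrainingArguments`. Only the hyperparameters listed come from this card; the output directory, evaluation strategy, and everything around the arguments (dataset, collator, `Trainer` setup) are assumptions. The Adam betas and epsilon listed above are the `TrainingArguments` defaults, so they need no explicit arguments.

```python
# Minimal sketch: TrainingArguments matching the hyperparameters above.
# The output_dir is an assumed name; the dataset and data pipeline are
# not documented in this card.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="modelo_entrenado_02",  # assumed output directory
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=100,
    evaluation_strategy="epoch",  # the results table shows one eval per epoch
)
```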
### Training results

| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:----:|:---------------:|
| No log | 1.0 | 2 | 2.7577 |
| No log | 2.0 | 4 | 2.9781 |
| No log | 3.0 | 6 | 2.8632 |
| No log | 4.0 | 8 | 2.8579 |
| No log | 5.0 | 10 | 3.0384 |
| No log | 6.0 | 12 | 2.7311 |
| No log | 7.0 | 14 | 3.2083 |
| No log | 8.0 | 16 | 2.7743 |
| No log | 9.0 | 18 | 2.9103 |
| No log | 10.0 | 20 | 2.9129 |
| No log | 11.0 | 22 | 2.8430 |
| No log | 12.0 | 24 | 3.0835 |
| No log | 13.0 | 26 | 2.3511 |
| No log | 14.0 | 28 | 2.7013 |
| No log | 15.0 | 30 | 3.1549 |
| No log | 16.0 | 32 | 2.6080 |
| No log | 17.0 | 34 | 2.8091 |
| No log | 18.0 | 36 | 2.9793 |
| No log | 19.0 | 38 | 3.0124 |
| No log | 20.0 | 40 | 3.3556 |
| No log | 21.0 | 42 | 3.0714 |
| No log | 22.0 | 44 | 2.7299 |
| No log | 23.0 | 46 | 3.0126 |
| No log | 24.0 | 48 | 2.8463 |
| No log | 25.0 | 50 | 2.7301 |
| No log | 26.0 | 52 | 3.0173 |
| No log | 27.0 | 54 | 2.7944 |
| No log | 28.0 | 56 | 2.9017 |
| No log | 29.0 | 58 | 2.8487 |
| No log | 30.0 | 60 | 2.6775 |
| No log | 31.0 | 62 | 2.8994 |
| No log | 32.0 | 64 | 2.9530 |
| No log | 33.0 | 66 | 2.9083 |
| No log | 34.0 | 68 | 3.0679 |
| No log | 35.0 | 70 | 3.2776 |
| No log | 36.0 | 72 | 3.1745 |
| No log | 37.0 | 74 | 3.1096 |
| No log | 38.0 | 76 | 2.9348 |
| No log | 39.0 | 78 | 2.8553 |
| No log | 40.0 | 80 | 2.8863 |
| No log | 41.0 | 82 | 3.4298 |
| No log | 42.0 | 84 | 2.6532 |
| No log | 43.0 | 86 | 3.0342 |
| No log | 44.0 | 88 | 2.9126 |
| No log | 45.0 | 90 | 3.0540 |
| No log | 46.0 | 92 | 2.5645 |
| No log | 47.0 | 94 | 2.5397 |
| No log | 48.0 | 96 | 2.9576 |
| No log | 49.0 | 98 | 3.0156 |
| No log | 50.0 | 100 | 2.9392 |
| No log | 51.0 | 102 | 3.0397 |
| No log | 52.0 | 104 | 2.6477 |
| No log | 53.0 | 106 | 3.2235 |
| No log | 54.0 | 108 | 2.8603 |
| No log | 55.0 | 110 | 3.0346 |
| No log | 56.0 | 112 | 3.1096 |
| No log | 57.0 | 114 | 3.3671 |
| No log | 58.0 | 116 | 3.1535 |
| No log | 59.0 | 118 | 2.7483 |
| No log | 60.0 | 120 | 3.1617 |
| No log | 61.0 | 122 | 3.2076 |
| No log | 62.0 | 124 | 3.0277 |
| No log | 63.0 | 126 | 3.1467 |
| No log | 64.0 | 128 | 2.9233 |
| No log | 65.0 | 130 | 3.1549 |
| No log | 66.0 | 132 | 3.0969 |
| No log | 67.0 | 134 | 3.0112 |
| No log | 68.0 | 136 | 3.1946 |
| No log | 69.0 | 138 | 3.2482 |
| No log | 70.0 | 140 | 3.0938 |
| No log | 71.0 | 142 | 3.2812 |
| No log | 72.0 | 144 | 3.1010 |
| No log | 73.0 | 146 | 2.9116 |
| No log | 74.0 | 148 | 2.9614 |
| No log | 75.0 | 150 | 2.8463 |
| No log | 76.0 | 152 | 3.0709 |
| No log | 77.0 | 154 | 2.9854 |
| No log | 78.0 | 156 | 2.8761 |
| No log | 79.0 | 158 | 3.5857 |
| No log | 80.0 | 160 | 3.1981 |
| No log | 81.0 | 162 | 3.0583 |
| No log | 82.0 | 164 | 2.9779 |
| No log | 83.0 | 166 | 3.3871 |
| No log | 84.0 | 168 | 3.2019 |
| No log | 85.0 | 170 | 2.9458 |
| No log | 86.0 | 172 | 2.9492 |
| No log | 87.0 | 174 | 3.5746 |
| No log | 88.0 | 176 | 3.2414 |
| No log | 89.0 | 178 | 3.1301 |
| No log | 90.0 | 180 | 3.0594 |
| No log | 91.0 | 182 | 3.1043 |
| No log | 92.0 | 184 | 3.3868 |
| No log | 93.0 | 186 | 3.0604 |
| No log | 94.0 | 188 | 2.9658 |
| No log | 95.0 | 190 | 3.2042 |
| No log | 96.0 | 192 | 3.2314 |
| No log | 97.0 | 194 | 3.2674 |
| No log | 98.0 | 196 | 3.1568 |
| No log | 99.0 | 198 | 3.2152 |
| No log | 100.0 | 200 | 3.1337 |

### Framework versions

- Transformers 4.41.0
- Pytorch 2.3.0+cu121
- Datasets 2.19.1
- Tokenizers 0.19.1
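The card does not state the downstream task. Since `distilroberta-base` is a masked language model and only a loss is reported, the checkpoint was most likely trained with a masked-LM head; under that assumption, it can be loaded for fill-mask inference as sketched below, with `modelo_entrenado_02` standing in for the actual local path or Hub model ID.

```python
# Sketch of loading the checkpoint for fill-mask inference.
# Assumes the fine-tune kept distilroberta-base's masked-LM head;
# "modelo_entrenado_02" stands in for the real path or Hub ID.
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="modelo_entrenado_02")
# RoBERTa-family tokenizers use <mask> as the mask token.
print(fill_mask("The goal of this model is to <mask> text."))
```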