---
license: apache-2.0
base_model: distilbert-base-uncased
tags:
- generated_from_keras_callback
model-index:
- name: edyfjm07/distilbert-base-uncased-QA4-finetuned-squad-es
  results: []
datasets:
- edyfjm07/squad_indicaciones_es
language:
- es
metrics:
- rouge
- recall
- accuracy
- f1
---

# edyfjm07/distilbert-base-uncased-QA4-finetuned-squad-es

This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on the [edyfjm07/squad_indicaciones_es](https://huggingface.co/datasets/edyfjm07/squad_indicaciones_es) dataset.
It achieves the following results on the evaluation set:
- Train Loss: 0.0931
- Train End Logits Accuracy: 0.9559
- Train Start Logits Accuracy: 0.9685
- Validation Loss: 1.2632
- Validation End Logits Accuracy: 0.8088
- Validation Start Logits Accuracy: 0.8088
- Epoch: 45

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

The model was fine-tuned and evaluated on [edyfjm07/squad_indicaciones_es](https://huggingface.co/datasets/edyfjm07/squad_indicaciones_es), a Spanish SQuAD-style extractive question-answering dataset.

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- optimizer: {'name': 'Adam', 'weight_decay': None, 'clipnorm': None, 'global_clipnorm': None, 'clipvalue': None, 'use_ema': False, 'ema_momentum': 0.99, 'ema_overwrite_frequency': None, 'jit_compile': True, 'is_legacy_optimizer': False, 'learning_rate': {'module': 'keras.optimizers.schedules', 'class_name': 'PolynomialDecay', 'config': {'initial_learning_rate': 1e-05, 'decay_steps': 5474, 'end_learning_rate': 0.0, 'power': 1.0, 'cycle': False, 'name': None}, 'registered_name': None}, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-08, 'amsgrad': False}
- training_precision: float32
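For reference, a minimal TensorFlow/Keras sketch (not the original training script) that rebuilds the optimizer described by the configuration above:

```python
import tensorflow as tf

# Linear decay from 1e-05 to 0.0 over 5474 steps
# (power=1.0 makes PolynomialDecay a linear schedule).
lr_schedule = tf.keras.optimizers.schedules.PolynomialDecay(
    initial_learning_rate=1e-05,
    decay_steps=5474,
    end_learning_rate=0.0,
    power=1.0,
    cycle=False,
)

# Adam with the hyperparameters listed above; jit_compile=True enables XLA.
optimizer = tf.keras.optimizers.Adam(
    learning_rate=lr_schedule,
    beta_1=0.9,
    beta_2=0.999,
    epsilon=1e-08,
    amsgrad=False,
    use_ema=False,
    jit_compile=True,
)
```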
### Training results

| Train Loss | Train End Logits Accuracy | Train Start Logits Accuracy | Validation Loss | Validation End Logits Accuracy | Validation Start Logits Accuracy | Epoch |
|:----------:|:-------------------------:|:---------------------------:|:---------------:|:------------------------------:|:--------------------------------:|:-----:|
| 3.8949     | 0.1733                    | 0.1891                      | 2.4981          | 0.3918                         | 0.3981                           | 0     |
| 2.0479     | 0.4097                    | 0.4811                      | 1.6575          | 0.4890                         | 0.6113                           | 1     |
| 1.4343     | 0.5599                    | 0.6166                      | 1.3371          | 0.5768                         | 0.6426                           | 2     |
| 1.0892     | 0.6313                    | 0.6891                      | 1.1850          | 0.6677                         | 0.6865                           | 3     |
| 0.9172     | 0.6870                    | 0.7405                      | 1.1305          | 0.6771                         | 0.7335                           | 4     |
| 0.7470     | 0.7258                    | 0.7910                      | 1.0674          | 0.7147                         | 0.7524                           | 5     |
| 0.6728     | 0.7426                    | 0.8088                      | 1.0843          | 0.7116                         | 0.7680                           | 6     |
| 0.5989     | 0.7721                    | 0.8403                      | 1.0787          | 0.7304                         | 0.7649                           | 7     |
| 0.4988     | 0.8057                    | 0.8582                      | 1.1091          | 0.7398                         | 0.7618                           | 8     |
| 0.4674     | 0.8214                    | 0.8540                      | 1.1150          | 0.7367                         | 0.7774                           | 9     |
| 0.4173     | 0.8256                    | 0.8782                      | 1.1434          | 0.7335                         | 0.7774                           | 10    |
| 0.3804     | 0.8319                    | 0.8897                      | 1.1256          | 0.7335                         | 0.7900                           | 11    |
| 0.3831     | 0.8456                    | 0.8834                      | 1.1614          | 0.7429                         | 0.7931                           | 12    |
| 0.3325     | 0.8550                    | 0.9097                      | 1.1519          | 0.7429                         | 0.7900                           | 13    |
| 0.3115     | 0.8739                    | 0.9076                      | 1.1423          | 0.7586                         | 0.7868                           | 14    |
| 0.2860     | 0.8792                    | 0.9160                      | 1.1335          | 0.7649                         | 0.8025                           | 15    |
| 0.2751     | 0.8834                    | 0.9181                      | 1.1135          | 0.7712                         | 0.8119                           | 16    |
| 0.2441     | 0.8918                    | 0.9296                      | 1.1771          | 0.7524                         | 0.7900                           | 17    |
| 0.2342     | 0.9044                    | 0.9370                      | 1.1433          | 0.7680                         | 0.8088                           | 18    |
| 0.2049     | 0.9254                    | 0.9391                      | 1.1689          | 0.7680                         | 0.7994                           | 19    |
| 0.2029     | 0.9170                    | 0.9475                      | 1.1659          | 0.8025                         | 0.8150                           | 20    |
| 0.1939     | 0.9170                    | 0.9422                      | 1.2030          | 0.7712                         | 0.8150                           | 21    |
| 0.1787     | 0.9202                    | 0.9548                      | 1.2073          | 0.7806                         | 0.8056                           | 22    |
| 0.2013     | 0.9233                    | 0.9485                      | 1.1615          | 0.7962                         | 0.7994                           | 23    |
| 0.1821     | 0.9349                    | 0.9443                      | 1.1657          | 0.7806                         | 0.8088                           | 24    |
| 0.1683     | 0.9328                    | 0.9464                      | 1.1684          | 0.7994                         | 0.8088                           | 25    |
| 0.1568     | 0.9286                    | 0.9580                      | 1.1909          | 0.7900                         | 0.8056                           | 26    |
| 0.1536     | 0.9244                    | 0.9590                      | 1.2054          | 0.7868                         | 0.8182                           | 27    |
| 0.1221     | 0.9485                    | 0.9601                      | 1.1996          | 0.7806                         | 0.8088                           | 28    |
| 0.1373     | 0.9349                    | 0.9601                      | 1.2201          | 0.7806                         | 0.8056                           | 29    |
| 0.1334     | 0.9443                    | 0.9569                      | 1.2531          | 0.7868                         | 0.8025                           | 30    |
| 0.1335     | 0.9422                    | 0.9569                      | 1.2030          | 0.7962                         | 0.8088                           | 31    |
| 0.1157     | 0.9485                    | 0.9590                      | 1.2142          | 0.7931                         | 0.8088                           | 32    |
| 0.1209     | 0.9475                    | 0.9590                      | 1.2215          | 0.7743                         | 0.7994                           | 33    |
| 0.1149     | 0.9548                    | 0.9653                      | 1.2125          | 0.7806                         | 0.8056                           | 34    |
| 0.1048     | 0.9538                    | 0.9674                      | 1.2632          | 0.7900                         | 0.8056                           | 35    |
| 0.1056     | 0.9475                    | 0.9706                      | 1.2485          | 0.7931                         | 0.8088                           | 36    |
| 0.0964     | 0.9653                    | 0.9685                      | 1.2468          | 0.7900                         | 0.8088                           | 37    |
| 0.1000     | 0.9559                    | 0.9664                      | 1.2422          | 0.7962                         | 0.8056                           | 38    |
| 0.0989     | 0.9601                    | 0.9653                      | 1.2620          | 0.8025                         | 0.8056                           | 39    |
| 0.1024     | 0.9590                    | 0.9674                      | 1.2528          | 0.7994                         | 0.8056                           | 40    |
| 0.0917     | 0.9548                    | 0.9716                      | 1.2506          | 0.7931                         | 0.8088                           | 41    |
| 0.0913     | 0.9580                    | 0.9685                      | 1.2538          | 0.8025                         | 0.8056                           | 42    |
| 0.0923     | 0.9664                    | 0.9632                      | 1.2619          | 0.8025                         | 0.8056                           | 43    |
| 0.0921     | 0.9559                    | 0.9643                      | 1.2621          | 0.8056                         | 0.8088                           | 44    |
| 0.0931     | 0.9559                    | 0.9685                      | 1.2632          | 0.8088                         | 0.8088                           | 45    |

### Framework versions

- Transformers 4.41.2
- TensorFlow 2.15.0
- Datasets 2.20.0
- Tokenizers 0.19.1
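## How to use

A minimal inference sketch using the 🤗 Transformers `pipeline` API; the question/context pair is hypothetical and only illustrates the expected Spanish QA input:

```python
from transformers import pipeline

# Load the fine-tuned checkpoint for extractive question answering.
# framework="tf" because the checkpoint was trained with Keras/TensorFlow.
qa = pipeline(
    "question-answering",
    model="edyfjm07/distilbert-base-uncased-QA4-finetuned-squad-es",
    framework="tf",
)

# Hypothetical example ("How many hours apart should I take the medication?").
result = qa(
    question="¿Cada cuántas horas debo tomar el medicamento?",
    context="Tomar una tableta cada ocho horas durante cinco días.",
)
print(result)  # {'score': ..., 'start': ..., 'end': ..., 'answer': ...}
```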