---
license: apache-2.0
base_model: distilbert-base-uncased
tags:
  - generated_from_keras_callback
model-index:
  - name: edyfjm07/distilbert-base-uncased-QA1-finetuned-squad-es
    results: []
language:
  - es
metrics:
  - rouge
  - f1
datasets:
  - edyfjm07/squad_indicaciones_es
pipeline_tag: question-answering
---

# edyfjm07/distilbert-base-uncased-QA1-finetuned-squad-es

This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on the edyfjm07/squad_indicaciones_es dataset. It achieves the following results at the final training epoch:

- Train Loss: 0.2131
- Train End Logits Accuracy: 0.9224
- Train Start Logits Accuracy: 0.9310
- Validation Loss: 1.0588
- Validation End Logits Accuracy: 0.8088
- Validation Start Logits Accuracy: 0.8150
- Epoch: 50
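Since the card's `pipeline_tag` is `question-answering`, the checkpoint can be loaded with the Transformers pipeline. Below is a minimal usage sketch; the TF model class is used because the weights were trained with Keras (see the framework versions at the end), and the question/context pair is purely illustrative, not taken from the training data:

```python
from transformers import AutoTokenizer, TFAutoModelForQuestionAnswering, pipeline

model_id = "edyfjm07/distilbert-base-uncased-QA1-finetuned-squad-es"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = TFAutoModelForQuestionAnswering.from_pretrained(model_id)  # TF checkpoint per this card

qa = pipeline("question-answering", model=model, tokenizer=tokenizer)

# Illustrative Spanish example ("How often should I take the medicine?" /
# "Take one tablet every 8 hours for 5 days.")
result = qa(
    question="¿Cada cuántas horas debo tomar el medicamento?",
    context="Tome una tableta cada 8 horas durante 5 días.",
)
print(result["answer"], result["score"])
```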

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

Per the card metadata, the model was fine-tuned on the edyfjm07/squad_indicaciones_es dataset; no further details on splits or preprocessing are provided.

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- optimizer: {'name': 'Adam', 'weight_decay': None, 'clipnorm': None, 'global_clipnorm': None, 'clipvalue': None, 'use_ema': False, 'ema_momentum': 0.99, 'ema_overwrite_frequency': None, 'jit_compile': True, 'is_legacy_optimizer': False, 'learning_rate': {'module': 'keras.optimizers.schedules', 'class_name': 'PolynomialDecay', 'config': {'initial_learning_rate': 1e-05, 'decay_steps': 1479, 'end_learning_rate': 0.0, 'power': 1.0, 'cycle': False, 'name': None}, 'registered_name': None}, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-08, 'amsgrad': False}
- training_precision: float32
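For reference, the serialized optimizer config above corresponds to the following Keras objects. This is a reconstruction from that config (a linear decay, since power=1.0, from 1e-05 to 0.0 over 1479 steps), assuming TensorFlow 2.15 as listed under framework versions:

```python
import tensorflow as tf

# PolynomialDecay with power=1.0 is a linear ramp from 1e-05 down to 0.0
lr_schedule = tf.keras.optimizers.schedules.PolynomialDecay(
    initial_learning_rate=1e-05,
    decay_steps=1479,
    end_learning_rate=0.0,
    power=1.0,
    cycle=False,
)

# Adam with the betas/epsilon from the serialized config; XLA compilation enabled
optimizer = tf.keras.optimizers.Adam(
    learning_rate=lr_schedule,
    beta_1=0.9,
    beta_2=0.999,
    epsilon=1e-08,
    amsgrad=False,
    jit_compile=True,
)
```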

### Training results

| Train Loss | Train End Logits Accuracy | Train Start Logits Accuracy | Validation Loss | Validation End Logits Accuracy | Validation Start Logits Accuracy | Epoch |
|:----------:|:-------------------------:|:---------------------------:|:---------------:|:------------------------------:|:--------------------------------:|:-----:|
| 5.1787     | 0.0571                    | 0.0496                      | 4.3181          | 0.1724                         | 0.1818                           | 0     |
| 3.6307     | 0.25                      | 0.1810                      | 2.8944          | 0.3793                         | 0.2476                           | 1     |
| 2.5094     | 0.3998                    | 0.3147                      | 2.1436          | 0.4514                         | 0.3793                           | 2     |
| 1.9078     | 0.4871                    | 0.4397                      | 1.7322          | 0.5204                         | 0.5705                           | 3     |
| 1.5135     | 0.5593                    | 0.5700                      | 1.4332          | 0.6050                         | 0.6238                           | 4     |
| 1.2802     | 0.5927                    | 0.6013                      | 1.3274          | 0.6270                         | 0.6364                           | 5     |
| 1.1079     | 0.6595                    | 0.6455                      | 1.2126          | 0.6520                         | 0.6865                           | 6     |
| 0.9827     | 0.6843                    | 0.7069                      | 1.1469          | 0.7116                         | 0.7116                           | 7     |
| 0.8810     | 0.7306                    | 0.7371                      | 1.0859          | 0.7116                         | 0.7053                           | 8     |
| 0.8194     | 0.7349                    | 0.7446                      | 1.0339          | 0.7429                         | 0.7492                           | 9     |
| 0.7245     | 0.7403                    | 0.7877                      | 1.0371          | 0.7304                         | 0.7398                           | 10    |
| 0.6827     | 0.7683                    | 0.7856                      | 1.0185          | 0.7492                         | 0.7461                           | 11    |
| 0.6421     | 0.7866                    | 0.8071                      | 1.0298          | 0.7492                         | 0.7555                           | 12    |
| 0.5949     | 0.8006                    | 0.8050                      | 0.9877          | 0.7586                         | 0.7774                           | 13    |
| 0.5471     | 0.8125                    | 0.8244                      | 0.9933          | 0.7398                         | 0.7774                           | 14    |
| 0.5119     | 0.8233                    | 0.8362                      | 0.9956          | 0.7524                         | 0.7837                           | 15    |
| 0.4916     | 0.8330                    | 0.8599                      | 0.9917          | 0.7398                         | 0.8025                           | 16    |
| 0.4521     | 0.8373                    | 0.8836                      | 0.9698          | 0.7680                         | 0.7868                           | 17    |
| 0.4424     | 0.8459                    | 0.8696                      | 0.9951          | 0.7712                         | 0.8025                           | 18    |
| 0.3928     | 0.8599                    | 0.8966                      | 1.0173          | 0.7618                         | 0.7931                           | 19    |
| 0.3874     | 0.8578                    | 0.8922                      | 1.0307          | 0.7649                         | 0.7931                           | 20    |
| 0.3822     | 0.8588                    | 0.8901                      | 1.0272          | 0.7680                         | 0.7900                           | 21    |
| 0.3859     | 0.8524                    | 0.8879                      | 1.0180          | 0.7555                         | 0.7962                           | 22    |
| 0.3672     | 0.8524                    | 0.8836                      | 1.0040          | 0.7837                         | 0.7994                           | 23    |
| 0.3409     | 0.8675                    | 0.8825                      | 1.0242          | 0.7900                         | 0.8088                           | 24    |
| 0.3564     | 0.8610                    | 0.8869                      | 1.0257          | 0.7900                         | 0.7900                           | 25    |
| 0.3324     | 0.8578                    | 0.9041                      | 1.0227          | 0.7837                         | 0.8088                           | 26    |
| 0.3066     | 0.8858                    | 0.9159                      | 1.0243          | 0.7900                         | 0.8025                           | 27    |
| 0.3026     | 0.8804                    | 0.9084                      | 1.0224          | 0.7774                         | 0.8088                           | 28    |
| 0.2896     | 0.8879                    | 0.9009                      | 1.0324          | 0.7649                         | 0.8182                           | 29    |
| 0.2710     | 0.8998                    | 0.9106                      | 1.0458          | 0.7868                         | 0.8088                           | 30    |
| 0.2727     | 0.8933                    | 0.9213                      | 1.0483          | 0.7806                         | 0.7931                           | 31    |
| 0.2728     | 0.8976                    | 0.9062                      | 1.0459          | 0.7868                         | 0.8088                           | 32    |
| 0.2780     | 0.8847                    | 0.9073                      | 1.0595          | 0.7962                         | 0.8056                           | 33    |
| 0.2641     | 0.8955                    | 0.9138                      | 1.0503          | 0.7868                         | 0.8025                           | 34    |
| 0.2611     | 0.9009                    | 0.9203                      | 1.0458          | 0.8025                         | 0.7962                           | 35    |
| 0.2502     | 0.9030                    | 0.9203                      | 1.0621          | 0.8025                         | 0.8025                           | 36    |
| 0.2655     | 0.8804                    | 0.9213                      | 1.0478          | 0.7994                         | 0.7994                           | 37    |
| 0.2434     | 0.9084                    | 0.9181                      | 1.0491          | 0.8025                         | 0.7994                           | 38    |
| 0.2409     | 0.9149                    | 0.9224                      | 1.0452          | 0.8025                         | 0.8088                           | 39    |
| 0.2271     | 0.9181                    | 0.9246                      | 1.0487          | 0.7962                         | 0.8119                           | 40    |
| 0.2288     | 0.9332                    | 0.9149                      | 1.0579          | 0.8056                         | 0.8056                           | 41    |
| 0.2444     | 0.9127                    | 0.9127                      | 1.0522          | 0.8056                         | 0.8119                           | 42    |
| 0.2145     | 0.9235                    | 0.9300                      | 1.0584          | 0.8025                         | 0.8088                           | 43    |
| 0.2264     | 0.9073                    | 0.9289                      | 1.0520          | 0.8025                         | 0.8119                           | 44    |
| 0.2120     | 0.9213                    | 0.9429                      | 1.0591          | 0.8119                         | 0.8088                           | 45    |
| 0.2280     | 0.9127                    | 0.9235                      | 1.0538          | 0.8088                         | 0.8056                           | 46    |
| 0.2166     | 0.9116                    | 0.9203                      | 1.0554          | 0.8088                         | 0.8088                           | 47    |
| 0.2184     | 0.9138                    | 0.9397                      | 1.0568          | 0.8119                         | 0.8088                           | 48    |
| 0.2087     | 0.9106                    | 0.9375                      | 1.0588          | 0.8088                         | 0.8150                           | 49    |
| 0.2131     | 0.9224                    | 0.9310                      | 1.0588          | 0.8088                         | 0.8150                           | 50    |

### Framework versions

- Transformers 4.41.2
- TensorFlow 2.15.0
- Datasets 2.20.0
- Tokenizers 0.19.1