---
license: apache-2.0
base_model: distilbert-base-uncased
tags:
- generated_from_keras_callback
model-index:
- name: edyfjm07/distilbert-base-uncased-QA2-finetuned-squad-es
  results: []
datasets:
- edyfjm07/squad_indicaciones_es
language:
- es
metrics:
- rouge
- f1
- recall
- accuracy
---

# edyfjm07/distilbert-base-uncased-QA2-finetuned-squad-es

This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) for extractive question answering on the [edyfjm07/squad_indicaciones_es](https://huggingface.co/datasets/edyfjm07/squad_indicaciones_es) dataset (listed in the card metadata above).
It achieves the following results at the final training epoch:
- Train Loss: 0.0138
- Train End Logits Accuracy: 0.9947
- Train Start Logits Accuracy: 1.0000
- Validation Loss: 1.7511
- Validation End Logits Accuracy: 0.7931
- Validation Start Logits Accuracy: 0.7994
- Epoch: 45

## Model description

This is a SQuAD-style extractive question-answering model for Spanish: given a question and a context passage, it predicts the start and end token positions of the answer span within the context. These two prediction heads are what the "start logits" and "end logits" accuracies below measure. Note that the base checkpoint, distilbert-base-uncased, is an English uncased model, so its tokenizer vocabulary is not tailored to Spanish.
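
As a sketch of how the two heads are used at inference time (the model id is this repository; the question/context strings are made-up examples):

```python
import tensorflow as tf
from transformers import AutoTokenizer, TFAutoModelForQuestionAnswering

model_id = "edyfjm07/distilbert-base-uncased-QA2-finetuned-squad-es"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = TFAutoModelForQuestionAnswering.from_pretrained(model_id)

# Hypothetical example; any Spanish question/context pair works the same way.
question = "¿Cada cuántas horas debo tomar el medicamento?"
context = "Tome una tableta cada ocho horas durante siete días."

inputs = tokenizer(question, context, return_tensors="tf")
outputs = model(**inputs)

# One logit per token for the answer start and one for the answer end;
# these two heads are what the start/end logits accuracies above measure.
start = int(tf.argmax(outputs.start_logits, axis=-1)[0])
end = int(tf.argmax(outputs.end_logits, axis=-1)[0])
ids = inputs["input_ids"][0].numpy()
print(tokenizer.decode(ids[start : end + 1]))
```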

## Intended uses & limitations

The model is intended for extractive question answering over short Spanish passages, in the style of the fine-tuning dataset. Two limitations are visible from the card itself: the English-uncased base model is a mismatch for Spanish text, and the training log below shows overfitting, with training accuracies approaching 1.0 while validation loss climbs from about 1.02 at epoch 2 to 1.75 at epoch 45, so an earlier checkpoint may generalize better. As with any extractive QA model, answers are always spans copied from the provided context; the model cannot abstain or generate free-form text.
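
For quick use, the standard `question-answering` pipeline should work; this is a sketch, not tested against this repository, and `framework="tf"` is passed only because the card reports TensorFlow/Keras training:

```python
from transformers import pipeline

qa = pipeline(
    "question-answering",
    model="edyfjm07/distilbert-base-uncased-QA2-finetuned-squad-es",
    framework="tf",  # the card lists TensorFlow 2.15.0; drop if PyTorch weights exist
)
result = qa(
    question="¿Cuántas veces al día debo tomarlo?",  # made-up example
    context="Tome el medicamento dos veces al día con alimentos.",
)
print(result["answer"], result["score"])
```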

## Training and evaluation data

The model was trained and evaluated on [edyfjm07/squad_indicaciones_es](https://huggingface.co/datasets/edyfjm07/squad_indicaciones_es), the Spanish SQuAD-format dataset listed in the card metadata.
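
To inspect the data, something like the following should work; the field names in the comment are an assumption based on the SQuAD format, not confirmed by the card:

```python
from datasets import load_dataset

# Dataset id taken from the card metadata above.
ds = load_dataset("edyfjm07/squad_indicaciones_es")
print(ds)              # split names and sizes
print(ds["train"][0])  # assumed SQuAD-style fields: context, question, answers
```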

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- optimizer: Adam (`beta_1=0.9`, `beta_2=0.999`, `epsilon=1e-08`, `amsgrad=False`, `jit_compile=True`)
- learning rate: `PolynomialDecay` from 1e-4 to 0.0 over 5,474 steps (`power=1.0`, i.e. linear; `cycle=False`)
- gradient clipping: none (`clipnorm`, `global_clipnorm`, and `clipvalue` all unset); EMA disabled (`use_ema=False`)
- weight decay: none
- training_precision: float32
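
As a sketch, the optimizer above can be reconstructed in Keras as follows; all values are copied from the config, nothing else is assumed:

```python
import tensorflow as tf

# Linear decay of the learning rate: 1e-4 -> 0.0 over 5,474 steps (power=1.0).
schedule = tf.keras.optimizers.schedules.PolynomialDecay(
    initial_learning_rate=1e-4,
    decay_steps=5474,
    end_learning_rate=0.0,
    power=1.0,
    cycle=False,
)
optimizer = tf.keras.optimizers.Adam(
    learning_rate=schedule, beta_1=0.9, beta_2=0.999, epsilon=1e-8
)
```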

### Training results

| Train Loss | Train End Logits Accuracy | Train Start Logits Accuracy | Validation Loss | Validation End Logits Accuracy | Validation Start Logits Accuracy | Epoch |
|:----------:|:-------------------------:|:---------------------------:|:---------------:|:------------------------------:|:--------------------------------:|:-----:|
| 2.3428     | 0.4160                    | 0.4317                      | 1.3438          | 0.5611                         | 0.6458                           | 0     |
| 1.1526     | 0.6261                    | 0.6397                      | 1.0597          | 0.6677                         | 0.7429                           | 1     |
| 0.7612     | 0.7269                    | 0.7647                      | 1.0245          | 0.7210                         | 0.7806                           | 2     |
| 0.5528     | 0.7836                    | 0.8319                      | 1.2436          | 0.7116                         | 0.7712                           | 3     |
| 0.4667     | 0.8340                    | 0.8435                      | 1.0705          | 0.7524                         | 0.7555                           | 4     |
| 0.3834     | 0.8813                    | 0.8687                      | 1.1209          | 0.7586                         | 0.7712                           | 5     |
| 0.3678     | 0.8634                    | 0.8876                      | 1.2341          | 0.7618                         | 0.7649                           | 6     |
| 0.2555     | 0.9044                    | 0.9181                      | 1.1561          | 0.7649                         | 0.8056                           | 7     |
| 0.2151     | 0.9160                    | 0.9328                      | 1.0908          | 0.7931                         | 0.7994                           | 8     |
| 0.1855     | 0.9286                    | 0.9475                      | 1.2809          | 0.7994                         | 0.7774                           | 9     |
| 0.1654     | 0.9443                    | 0.9454                      | 1.3974          | 0.7837                         | 0.7806                           | 10    |
| 0.1282     | 0.9464                    | 0.9517                      | 1.4260          | 0.7774                         | 0.7837                           | 11    |
| 0.1313     | 0.9443                    | 0.9601                      | 1.4537          | 0.7900                         | 0.7962                           | 12    |
| 0.1301     | 0.9517                    | 0.9590                      | 1.1851          | 0.7774                         | 0.8150                           | 13    |
| 0.1089     | 0.9548                    | 0.9590                      | 1.2442          | 0.7774                         | 0.8088                           | 14    |
| 0.1023     | 0.9601                    | 0.9622                      | 1.4575          | 0.7931                         | 0.7931                           | 15    |
| 0.0956     | 0.9590                    | 0.9685                      | 1.5160          | 0.7837                         | 0.7900                           | 16    |
| 0.0712     | 0.9727                    | 0.9737                      | 1.5741          | 0.7900                         | 0.8088                           | 17    |
| 0.0752     | 0.9674                    | 0.9790                      | 1.4401          | 0.7931                         | 0.7994                           | 18    |
| 0.0604     | 0.9737                    | 0.9779                      | 1.6410          | 0.7962                         | 0.8088                           | 19    |
| 0.0497     | 0.9758                    | 0.9821                      | 1.5655          | 0.7962                         | 0.8119                           | 20    |
| 0.0668     | 0.9685                    | 0.9811                      | 1.3480          | 0.7806                         | 0.7962                           | 21    |
| 0.0567     | 0.9769                    | 0.9800                      | 1.3820          | 0.7900                         | 0.8088                           | 22    |
| 0.0550     | 0.9769                    | 0.9832                      | 1.3593          | 0.7806                         | 0.8056                           | 23    |
| 0.0399     | 0.9821                    | 0.9884                      | 1.5254          | 0.7868                         | 0.7931                           | 24    |
| 0.0320     | 0.9842                    | 0.9874                      | 1.5801          | 0.7868                         | 0.7994                           | 25    |
| 0.0296     | 0.9832                    | 0.9884                      | 1.6310          | 0.7962                         | 0.7962                           | 26    |
| 0.0307     | 0.9863                    | 0.9926                      | 1.4756          | 0.7774                         | 0.7900                           | 27    |
| 0.0254     | 0.9863                    | 0.9895                      | 1.7564          | 0.7774                         | 0.7931                           | 28    |
| 0.0255     | 0.9853                    | 0.9937                      | 1.6061          | 0.7774                         | 0.7962                           | 29    |
| 0.0214     | 0.9863                    | 0.9937                      | 1.7697          | 0.7712                         | 0.8056                           | 30    |
| 0.0283     | 0.9842                    | 0.9863                      | 1.8398          | 0.7806                         | 0.7900                           | 31    |
| 0.0182     | 0.9905                    | 0.9926                      | 1.8756          | 0.7837                         | 0.7994                           | 32    |
| 0.0252     | 0.9832                    | 0.9947                      | 1.8182          | 0.7837                         | 0.7962                           | 33    |
| 0.0222     | 0.9863                    | 0.9947                      | 1.7854          | 0.7837                         | 0.7931                           | 34    |
| 0.0216     | 0.9884                    | 0.9947                      | 1.5707          | 0.7931                         | 0.8025                           | 35    |
| 0.0161     | 0.9937                    | 0.9916                      | 1.7071          | 0.7806                         | 0.8025                           | 36    |
| 0.0146     | 0.9926                    | 0.9926                      | 1.7827          | 0.7868                         | 0.7962                           | 37    |
| 0.0148     | 0.9905                    | 0.9947                      | 1.8678          | 0.7868                         | 0.7931                           | 38    |
| 0.0117     | 0.9884                    | 0.9968                      | 1.7944          | 0.7868                         | 0.7900                           | 39    |
| 0.0137     | 0.9905                    | 0.9958                      | 1.7666          | 0.7900                         | 0.7931                           | 40    |
| 0.0160     | 0.9874                    | 0.9958                      | 1.7644          | 0.7868                         | 0.7962                           | 41    |
| 0.0150     | 0.9916                    | 0.9937                      | 1.7783          | 0.7868                         | 0.8025                           | 42    |
| 0.0128     | 0.9895                    | 0.9958                      | 1.7480          | 0.7900                         | 0.7994                           | 43    |
| 0.0102     | 0.9937                    | 0.9947                      | 1.7432          | 0.7931                         | 0.7994                           | 44    |
| 0.0138     | 0.9947                    | 1.0000                      | 1.7511          | 0.7931                         | 0.7994                           | 45    |


### Framework versions

- Transformers 4.41.2
- TensorFlow 2.15.0
- Datasets 2.20.0
- Tokenizers 0.19.1