---
license: apache-2.0
base_model: distilbert-base-uncased
tags:
- generated_from_keras_callback
model-index:
- name: edyfjm07/distilbert-base-uncased-QA3-finetuned-squad-es
  results: []
datasets:
- edyfjm07/squad_indicaciones_es
language:
- es
metrics:
- rouge
- recall
- accuracy
- f1
---

# edyfjm07/distilbert-base-uncased-QA3-finetuned-squad-es

This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on the [edyfjm07/squad_indicaciones_es](https://huggingface.co/datasets/edyfjm07/squad_indicaciones_es) dataset.
It achieves the following results at the end of training:
- Train Loss: 5.9545
- Train End Logits Accuracy: 0.0032
- Train Start Logits Accuracy: 0.0
- Validation Loss: 5.9506
- Validation End Logits Accuracy: 0.0
- Validation Start Logits Accuracy: 0.0063
- Epoch: 40

## Model description

This model is a [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) checkpoint fine-tuned with Keras for extractive (SQuAD-style) question answering on Spanish text. Note that the base model was pretrained on English, which may limit how well it transfers to Spanish.

## Intended uses & limitations

The model is intended for extractive question answering over Spanish text, using the SQuAD start/end-span format. However, the training results below show end- and start-logit accuracies near zero throughout, with the validation loss stuck at 5.9506 from epoch 2 onward, so this checkpoint does not appear to have learned the task. It should be treated as an experimental artifact rather than a production-ready QA model.
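
A minimal usage sketch with the Transformers `question-answering` pipeline (the question/context pair is hypothetical; `framework="tf"` is passed on the assumption that only TensorFlow weights are published, since the model was trained with Keras):

```python
from transformers import pipeline

# Load the fine-tuned checkpoint from the Hugging Face Hub.
qa = pipeline(
    "question-answering",
    model="edyfjm07/distilbert-base-uncased-QA3-finetuned-squad-es",
    framework="tf",  # assumption: the repo contains TF (Keras) weights
)

# Hypothetical Spanish question/context pair.
result = qa(
    question="¿Cada cuántas horas debo tomar el medicamento?",
    context="Tomar una tableta cada 8 horas durante 5 días.",
)
print(result)  # {'score': ..., 'start': ..., 'end': ..., 'answer': ...}
```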

## Training and evaluation data

Per the card metadata, the model was trained and evaluated on [edyfjm07/squad_indicaciones_es](https://huggingface.co/datasets/edyfjm07/squad_indicaciones_es), a Spanish SQuAD-style question-answering dataset. Details about splits and preprocessing are not available.
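
A sketch of loading the dataset, assuming it follows the usual SQuAD schema (`question`, `context`, `answers`):

```python
from datasets import load_dataset

# Load the Spanish SQuAD-style dataset referenced in the card metadata.
ds = load_dataset("edyfjm07/squad_indicaciones_es")
print(ds)              # available splits
print(ds["train"][0])  # assumed SQuAD-like fields: question, context, answers
```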

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- optimizer: Adam (beta_1=0.9, beta_2=0.999, epsilon=1e-08, amsgrad=False, weight_decay=None, jit_compile=True; no gradient clipping, no EMA)
- learning rate schedule: PolynomialDecay (initial_learning_rate=0.001, decay_steps=2419, end_learning_rate=0.0, power=1.0, cycle=False)
- training_precision: float32
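
The configuration above can be reconstructed in Keras roughly as follows (a sketch; `decay_steps=2419` corresponds to the length of this particular training run):

```python
import tensorflow as tf

# Linear decay from 1e-3 to 0 over the run (PolynomialDecay with power=1.0).
lr_schedule = tf.keras.optimizers.schedules.PolynomialDecay(
    initial_learning_rate=0.001,
    decay_steps=2419,
    end_learning_rate=0.0,
    power=1.0,
    cycle=False,
)

optimizer = tf.keras.optimizers.Adam(
    learning_rate=lr_schedule,
    beta_1=0.9,
    beta_2=0.999,
    epsilon=1e-08,
    amsgrad=False,
    jit_compile=True,
)
```

An initial learning rate of 1e-3 is high for fine-tuning a pretrained transformer (2e-5 to 5e-5 is more typical), which may explain the loss jump between epochs 1 and 2 and the flat validation loss in the table below.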

### Training results

| Train Loss | Train End Logits Accuracy | Train Start Logits Accuracy | Validation Loss | Validation End Logits Accuracy | Validation Start Logits Accuracy | Epoch |
|:----------:|:-------------------------:|:---------------------------:|:---------------:|:------------------------------:|:--------------------------------:|:-----:|
| 4.7467     | 0.1006                    | 0.0561                      | 5.8046          | 0.0157                         | 0.0878                           | 0     |
| 4.8045     | 0.0148                    | 0.0138                      | 5.2042          | 0.0094                         | 0.0094                           | 1     |
| 5.9402     | 0.0032                    | 0.0053                      | 5.9506          | 0.0031                         | 0.0063                           | 2     |
| 5.9626     | 0.0021                    | 0.0021                      | 5.9506          | 0.0031                         | 0.0031                           | 3     |
| 5.9599     | 0.0042                    | 0.0                         | 5.9506          | 0.0                            | 0.0                              | 4     |
| 5.9718     | 0.0                       | 0.0011                      | 5.9506          | 0.0                            | 0.0031                           | 5     |
| 5.9587     | 0.0021                    | 0.0064                      | 5.9506          | 0.0031                         | 0.0031                           | 6     |
| 5.9657     | 0.0064                    | 0.0032                      | 5.9506          | 0.0031                         | 0.0188                           | 7     |
| 5.9617     | 0.0021                    | 0.0032                      | 5.9506          | 0.0031                         | 0.0063                           | 8     |
| 5.9596     | 0.0021                    | 0.0032                      | 5.9506          | 0.0                            | 0.0031                           | 9     |
| 5.9648     | 0.0021                    | 0.0021                      | 5.9506          | 0.0094                         | 0.0063                           | 10    |
| 5.9608     | 0.0021                    | 0.0032                      | 5.9506          | 0.0125                         | 0.0094                           | 11    |
| 5.9567     | 0.0021                    | 0.0053                      | 5.9506          | 0.0063                         | 0.0                              | 12    |
| 5.9625     | 0.0011                    | 0.0011                      | 5.9506          | 0.0                            | 0.0                              | 13    |
| 5.9640     | 0.0                       | 0.0011                      | 5.9506          | 0.0031                         | 0.0                              | 14    |
| 5.9606     | 0.0011                    | 0.0                         | 5.9506          | 0.0063                         | 0.0063                           | 15    |
| 5.9622     | 0.0032                    | 0.0053                      | 5.9506          | 0.0094                         | 0.0063                           | 16    |
| 5.9600     | 0.0011                    | 0.0021                      | 5.9506          | 0.0                            | 0.0063                           | 17    |
| 5.9579     | 0.0011                    | 0.0011                      | 5.9506          | 0.0063                         | 0.0094                           | 18    |
| 5.9598     | 0.0032                    | 0.0053                      | 5.9506          | 0.0031                         | 0.0                              | 19    |
| 5.9589     | 0.0021                    | 0.0032                      | 5.9506          | 0.0063                         | 0.0031                           | 20    |
| 5.9566     | 0.0032                    | 0.0021                      | 5.9506          | 0.0                            | 0.0                              | 21    |
| 5.9536     | 0.0011                    | 0.0053                      | 5.9506          | 0.0                            | 0.0                              | 22    |
| 5.9592     | 0.0021                    | 0.0021                      | 5.9506          | 0.0031                         | 0.0031                           | 23    |
| 5.9548     | 0.0032                    | 0.0042                      | 5.9506          | 0.0                            | 0.0                              | 24    |
| 5.9569     | 0.0                       | 0.0021                      | 5.9506          | 0.0                            | 0.0                              | 25    |
| 5.9640     | 0.0032                    | 0.0011                      | 5.9506          | 0.0031                         | 0.0031                           | 26    |
| 5.9497     | 0.0011                    | 0.0011                      | 5.9506          | 0.0                            | 0.0031                           | 27    |
| 5.9558     | 0.0                       | 0.0053                      | 5.9506          | 0.0063                         | 0.0031                           | 28    |
| 5.9563     | 0.0021                    | 0.0032                      | 5.9506          | 0.0063                         | 0.0063                           | 29    |
| 5.9585     | 0.0032                    | 0.0032                      | 5.9506          | 0.0                            | 0.0094                           | 30    |
| 5.9569     | 0.0011                    | 0.0021                      | 5.9506          | 0.0094                         | 0.0063                           | 31    |
| 5.9580     | 0.0011                    | 0.0021                      | 5.9506          | 0.0063                         | 0.0                              | 32    |
| 5.9532     | 0.0032                    | 0.0011                      | 5.9506          | 0.0                            | 0.0063                           | 33    |
| 5.9523     | 0.0021                    | 0.0032                      | 5.9506          | 0.0                            | 0.0                              | 34    |
| 5.9552     | 0.0042                    | 0.0011                      | 5.9506          | 0.0                            | 0.0                              | 35    |
| 5.9538     | 0.0021                    | 0.0032                      | 5.9506          | 0.0                            | 0.0                              | 36    |
| 5.9538     | 0.0032                    | 0.0032                      | 5.9506          | 0.0031                         | 0.0063                           | 37    |
| 5.9567     | 0.0011                    | 0.0021                      | 5.9506          | 0.0063                         | 0.0031                           | 38    |
| 5.9570     | 0.0053                    | 0.0032                      | 5.9506          | 0.0                            | 0.0031                           | 39    |
| 5.9545     | 0.0032                    | 0.0                         | 5.9506          | 0.0                            | 0.0063                           | 40    |


### Framework versions

- Transformers 4.41.2
- TensorFlow 2.15.0
- Datasets 2.20.0
- Tokenizers 0.19.1