---
license: apache-2.0
base_model: distilbert-base-uncased
tags:
- generated_from_keras_callback
model-index:
- name: edyfjm07/distilbert-base-uncased-QA2-finetuned-squad-es
  results: []
---

# edyfjm07/distilbert-base-uncased-QA2-finetuned-squad-es

This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on an unspecified dataset (the model name points to a Spanish SQuAD-style corpus). It achieves the following results at the final training epoch (epoch 24, zero-indexed, so the 25th pass over the data); a quick usage sketch follows the list:
- Train Loss: 0.0399
- Train End Logits Accuracy: 0.9821
- Train Start Logits Accuracy: 0.9884
- Validation Loss: 1.5254
- Validation End Logits Accuracy: 0.7868
- Validation Start Logits Accuracy: 0.7931
- Epoch: 24
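
As a quick way to try the checkpoint, here is a minimal sketch using the `transformers` question-answering pipeline. The example question and context are placeholders, and `framework="tf"` is an assumption based on the Keras training setup implied by this card:

```python
from transformers import pipeline

# Load the checkpoint named in this card; framework="tf" assumes the
# repository ships TensorFlow weights, as the Keras training implies.
qa = pipeline(
    "question-answering",
    model="edyfjm07/distilbert-base-uncased-QA2-finetuned-squad-es",
    framework="tf",
)

# Placeholder Spanish question/context, not taken from the card.
result = qa(
    question="¿Dónde nació Cervantes?",
    context="Miguel de Cervantes nació en Alcalá de Henares.",
)
print(result["answer"], result["score"])
```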

## Model description

This checkpoint is `distilbert-base-uncased` fine-tuned with Keras/TensorFlow for extractive question answering: given a question and a context passage, the model predicts the start and end token positions of the answer span, which is why start- and end-logits accuracies are reported separately above. Judging by the model name, the target language is Spanish, even though the base model's vocabulary was built from English text.
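
To make the start/end-logits formulation concrete, below is a minimal inference sketch using the TensorFlow classes from `transformers`; the question and context strings are placeholders, not examples from the training data:

```python
import tensorflow as tf
from transformers import AutoTokenizer, TFAutoModelForQuestionAnswering

model_id = "edyfjm07/distilbert-base-uncased-QA2-finetuned-squad-es"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = TFAutoModelForQuestionAnswering.from_pretrained(model_id)

# Placeholder Spanish question/context for illustration only.
question = "¿Quién escribió Don Quijote?"
context = "Don Quijote fue escrito por Miguel de Cervantes."

inputs = tokenizer(question, context, return_tensors="tf")
outputs = model(**inputs)

# The model scores every token as a candidate answer start and end;
# the argmax of each logits vector gives the predicted span.
start = int(tf.argmax(outputs.start_logits, axis=-1)[0])
end = int(tf.argmax(outputs.end_logits, axis=-1)[0])
answer = tokenizer.decode(inputs["input_ids"][0, start : end + 1])
print(answer)
```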

## Intended uses & limitations

The model is intended for extractive question answering, i.e. returning a short answer span copied verbatim from a supplied context passage, presumably in Spanish given the model name. Two limitations are visible in the card itself: the base model was pretrained on English text, so its uncased English vocabulary may tokenize Spanish suboptimally; and the training log below shows validation loss rising (from about 1.02 at epoch 2 to 1.53 at epoch 24) while training loss keeps falling, a typical sign of overfitting, with validation accuracies plateauing around 0.78-0.81 from epoch 8 onward.

## Training and evaluation data

Not documented. The model name suggests a Spanish SQuAD-style dataset (SQuAD-es), but the exact corpus and the train/validation split sizes are not recorded in this card.

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- optimizer: Adam (beta_1 = 0.9, beta_2 = 0.999, epsilon = 1e-08, amsgrad = False, no weight decay, no gradient clipping, use_ema = False, jit_compile = True)
- learning rate: PolynomialDecay schedule from 1e-4 to 0.0 over 5,474 steps (power = 1.0, i.e. linear decay, cycle = False)
- training_precision: float32
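
For reference, a minimal sketch that reconstructs this optimizer in TensorFlow/Keras; every argument below is taken from the recorded config:

```python
import tensorflow as tf

# Linear (power=1.0) decay from 1e-4 to 0.0 over 5,474 steps,
# matching the PolynomialDecay entry in the recorded config.
lr_schedule = tf.keras.optimizers.schedules.PolynomialDecay(
    initial_learning_rate=1e-4,
    decay_steps=5474,
    end_learning_rate=0.0,
    power=1.0,
    cycle=False,
)

optimizer = tf.keras.optimizers.Adam(
    learning_rate=lr_schedule,
    beta_1=0.9,
    beta_2=0.999,
    epsilon=1e-8,
    amsgrad=False,
    jit_compile=True,  # accepted by the Keras 2.x bundled with TF 2.15
)
```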

### Training results

| Train Loss | Train End Logits Accuracy | Train Start Logits Accuracy | Validation Loss | Validation End Logits Accuracy | Validation Start Logits Accuracy | Epoch |
|:----------:|:-------------------------:|:---------------------------:|:---------------:|:------------------------------:|:--------------------------------:|:-----:|
| 2.3428     | 0.4160                    | 0.4317                      | 1.3438          | 0.5611                         | 0.6458                           | 0     |
| 1.1526     | 0.6261                    | 0.6397                      | 1.0597          | 0.6677                         | 0.7429                           | 1     |
| 0.7612     | 0.7269                    | 0.7647                      | 1.0245          | 0.7210                         | 0.7806                           | 2     |
| 0.5528     | 0.7836                    | 0.8319                      | 1.2436          | 0.7116                         | 0.7712                           | 3     |
| 0.4667     | 0.8340                    | 0.8435                      | 1.0705          | 0.7524                         | 0.7555                           | 4     |
| 0.3834     | 0.8813                    | 0.8687                      | 1.1209          | 0.7586                         | 0.7712                           | 5     |
| 0.3678     | 0.8634                    | 0.8876                      | 1.2341          | 0.7618                         | 0.7649                           | 6     |
| 0.2555     | 0.9044                    | 0.9181                      | 1.1561          | 0.7649                         | 0.8056                           | 7     |
| 0.2151     | 0.9160                    | 0.9328                      | 1.0908          | 0.7931                         | 0.7994                           | 8     |
| 0.1855     | 0.9286                    | 0.9475                      | 1.2809          | 0.7994                         | 0.7774                           | 9     |
| 0.1654     | 0.9443                    | 0.9454                      | 1.3974          | 0.7837                         | 0.7806                           | 10    |
| 0.1282     | 0.9464                    | 0.9517                      | 1.4260          | 0.7774                         | 0.7837                           | 11    |
| 0.1313     | 0.9443                    | 0.9601                      | 1.4537          | 0.7900                         | 0.7962                           | 12    |
| 0.1301     | 0.9517                    | 0.9590                      | 1.1851          | 0.7774                         | 0.8150                           | 13    |
| 0.1089     | 0.9548                    | 0.9590                      | 1.2442          | 0.7774                         | 0.8088                           | 14    |
| 0.1023     | 0.9601                    | 0.9622                      | 1.4575          | 0.7931                         | 0.7931                           | 15    |
| 0.0956     | 0.9590                    | 0.9685                      | 1.5160          | 0.7837                         | 0.7900                           | 16    |
| 0.0712     | 0.9727                    | 0.9737                      | 1.5741          | 0.7900                         | 0.8088                           | 17    |
| 0.0752     | 0.9674                    | 0.9790                      | 1.4401          | 0.7931                         | 0.7994                           | 18    |
| 0.0604     | 0.9737                    | 0.9779                      | 1.6410          | 0.7962                         | 0.8088                           | 19    |
| 0.0497     | 0.9758                    | 0.9821                      | 1.5655          | 0.7962                         | 0.8119                           | 20    |
| 0.0668     | 0.9685                    | 0.9811                      | 1.3480          | 0.7806                         | 0.7962                           | 21    |
| 0.0567     | 0.9769                    | 0.9800                      | 1.3820          | 0.7900                         | 0.8088                           | 22    |
| 0.0550     | 0.9769                    | 0.9832                      | 1.3593          | 0.7806                         | 0.8056                           | 23    |
| 0.0399     | 0.9821                    | 0.9884                      | 1.5254          | 0.7868                         | 0.7931                           | 24    |


### Framework versions

- Transformers 4.41.2
- TensorFlow 2.15.0
- Datasets 2.20.0
- Tokenizers 0.19.1