---
license: apache-2.0
base_model: bert-base-multilingual-cased
tags:
- generated_from_trainer
metrics:
- accuracy
- precision
- recall
- f1
model-index:
- name: spa-eng-pos-tagging-v1.3
  results: []
---
# spa-eng-pos-tagging-v1.3

This model is a fine-tuned version of [bert-base-multilingual-cased](https://huggingface.co/bert-base-multilingual-cased) on an unknown dataset. It achieves the following results on the evaluation set:
- Loss: 0.1650
- Accuracy: 0.9471
- Precision: 0.9372
- Recall: 0.8815
- F1: 0.8779
- Hamming Loss: 0.0529
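As a minimal usage sketch, the checkpoint can be loaded with the `token-classification` pipeline. The repo id below is a placeholder, not confirmed by this card; substitute the id the model is actually published under.

```python
from transformers import pipeline

# "user/spa-eng-pos-tagging-v1.3" is a hypothetical repo id.
tagger = pipeline(
    "token-classification",
    model="user/spa-eng-pos-tagging-v1.3",
    aggregation_strategy="simple",  # merge WordPiece sub-tokens into words
)

# Code-switched Spanish-English input, matching the setting the model name suggests.
for token in tagger("I want to comer una manzana."):
    print(token["word"], token["entity_group"], round(token["score"], 3))
```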
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 16
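For orientation, these hyperparameters map onto a `transformers.TrainingArguments` object roughly as follows. This is a sketch, not the original training script; `output_dir` and the per-epoch evaluation strategy are assumptions, since the card does not state them.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="spa-eng-pos-tagging-v1.3",  # assumption: not stated in the card
    learning_rate=5e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=32,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=16,
    evaluation_strategy="epoch",  # assumption: the table below reports one eval per epoch
)
```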
### Training results
| Training Loss | Epoch | Step  | Validation Loss | Accuracy | Precision | Recall | F1     | Hamming Loss |
|:-------------:|:-----:|:-----:|:---------------:|:--------:|:---------:|:------:|:------:|:------------:|
| 0.3809        | 1.0   | 1744  | 0.2945          | 0.8919   | 0.8798    | 0.8290 | 0.8221 | 0.1081       |
| 0.2625        | 2.0   | 3488  | 0.2725          | 0.8975   | 0.9004    | 0.8279 | 0.8319 | 0.1025       |
| 0.1918        | 3.0   | 5232  | 0.1901          | 0.9317   | 0.9224    | 0.8645 | 0.8618 | 0.0683       |
| 0.1674        | 4.0   | 6976  | 0.1780          | 0.9369   | 0.9319    | 0.8695 | 0.8694 | 0.0631       |
| 0.1478        | 5.0   | 8720  | 0.1816          | 0.9385   | 0.9303    | 0.8735 | 0.8697 | 0.0615       |
| 0.1201        | 6.0   | 10464 | 0.1650          | 0.9471   | 0.9372    | 0.8815 | 0.8779 | 0.0529       |
| 0.096         | 7.0   | 12208 | 0.1663          | 0.9493   | 0.9390    | 0.8851 | 0.8806 | 0.0507       |
| 0.0844        | 8.0   | 13952 | 0.1715          | 0.9500   | 0.9421    | 0.8838 | 0.8815 | 0.0500       |
| 0.0687        | 9.0   | 15696 | 0.1877          | 0.9502   | 0.9433    | 0.8816 | 0.8811 | 0.0498       |
| 0.0573        | 10.0  | 17440 | 0.1949          | 0.9483   | 0.9444    | 0.8781 | 0.8799 | 0.0517       |
| 0.0533        | 11.0  | 19184 | 0.1960          | 0.9544   | 0.9450    | 0.8872 | 0.8847 | 0.0456       |
| 0.0399        | 12.0  | 20928 | 0.2012          | 0.9565   | 0.9494    | 0.8884 | 0.8876 | 0.0435       |
| 0.031         | 13.0  | 22672 | 0.2119          | 0.9571   | 0.9496    | 0.8889 | 0.8879 | 0.0429       |
| 0.0292        | 14.0  | 24416 | 0.2213          | 0.9587   | 0.9512    | 0.8906 | 0.8896 | 0.0413       |
| 0.024         | 15.0  | 26160 | 0.2274          | 0.9587   | 0.9517    | 0.8899 | 0.8895 | 0.0413       |
| 0.0198        | 16.0  | 27904 | 0.2314          | 0.9591   | 0.8894    | 0.8905 | 0.8899 | 0.0409       |
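For reference, the tabulated metrics can be computed from flattened per-token labels with scikit-learn along these lines. This is a sketch with toy data; the card does not state which averaging was used for precision/recall/F1, so `average="macro"` is an assumption.

```python
from sklearn.metrics import (
    accuracy_score,
    precision_score,
    recall_score,
    f1_score,
    hamming_loss,
)

# Toy stand-ins: flat per-token tag ids with padding/special tokens removed.
y_true = [0, 1, 2, 2, 1, 0]
y_pred = [0, 1, 2, 1, 1, 0]

print("accuracy: ", accuracy_score(y_true, y_pred))
print("precision:", precision_score(y_true, y_pred, average="macro"))  # averaging assumed
print("recall:   ", recall_score(y_true, y_pred, average="macro"))
print("f1:       ", f1_score(y_true, y_pred, average="macro"))
# With exactly one tag per token, hamming loss reduces to 1 - accuracy,
# which is the relationship the Accuracy and Hamming Loss columns show.
print("hamming:  ", hamming_loss(y_true, y_pred))
```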
### Framework versions
- Transformers 4.32.0
- Pytorch 2.0.1+cu118
- Tokenizers 0.13.3