---
license: apache-2.0
base_model: bert-base-multilingual-cased
tags:
- generated_from_trainer
metrics:
- accuracy
- precision
- recall
- f1
model-index:
- name: FrozenLAST-BERT-multilingual-finetuned-CEFR_ner-3000news
  results: []
---
# FrozenLAST-BERT-multilingual-finetuned-CEFR_ner-3000news
This model is a fine-tuned version of bert-base-multilingual-cased on an unknown dataset. It achieves the following results on the evaluation set:
- Loss: 0.5231
- Accuracy: 0.4029
- Precision: 0.4730
- Recall: 0.6519
- F1: 0.4491
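
The model name and the token-level metrics above suggest a token-classification (NER-style) checkpoint that tags tokens with CEFR levels. A minimal inference sketch follows; the Hub repo id (no namespace is given in this card), the example sentence, and the label set are assumptions.

```python
# Minimal inference sketch, assuming the checkpoint is a token-classification
# model published on the Hub. The repo id below is a placeholder: prepend the
# actual user/organization namespace, which this card does not state.
from transformers import AutoModelForTokenClassification, AutoTokenizer, pipeline

model_id = "FrozenLAST-BERT-multilingual-finetuned-CEFR_ner-3000news"  # placeholder repo id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForTokenClassification.from_pretrained(model_id)

# aggregation_strategy="simple" merges WordPiece sub-tokens back into whole words.
tagger = pipeline(
    "token-classification",
    model=model,
    tokenizer=tokenizer,
    aggregation_strategy="simple",
)
print(tagger("The government announced new measures for schools yesterday."))
```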
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training (a `TrainingArguments` sketch follows the list):
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 12
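
The sketch below mirrors the listed values in a `Trainer` setup. Only the numeric hyperparameters come from this card; the model, datasets, and metrics function are placeholders, since the card does not document them.

```python
# Sketch of a Trainer configuration reproducing the listed hyperparameters.
# Everything except the numeric values is an assumption (placeholders).
from transformers import TrainingArguments, Trainer

training_args = TrainingArguments(
    output_dir="FrozenLAST-BERT-multilingual-finetuned-CEFR_ner-3000news",
    learning_rate=2e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    num_train_epochs=12,
    lr_scheduler_type="linear",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    seed=42,
    evaluation_strategy="epoch",   # the results table reports one evaluation per epoch
)

# trainer = Trainer(
#     model=model,                      # token-classification head, not shown here
#     args=training_args,
#     train_dataset=train_dataset,      # placeholder datasets
#     eval_dataset=eval_dataset,
#     compute_metrics=compute_metrics,  # see the sketch after the results table
# )
# trainer.train()
```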
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Precision | Recall | F1     |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:---------:|:------:|:------:|
| No log        | 1.0   | 132  | 0.6719          | 0.3466   | 0.5010    | 0.4385 | 0.3075 |
| No log        | 2.0   | 264  | 0.5498          | 0.3741   | 0.5375    | 0.5142 | 0.3600 |
| No log        | 3.0   | 396  | 0.4966          | 0.3872   | 0.4761    | 0.5740 | 0.3970 |
| 0.6209        | 4.0   | 528  | 0.4753          | 0.3937   | 0.4654    | 0.5910 | 0.4146 |
| 0.6209        | 5.0   | 660  | 0.4714          | 0.3952   | 0.4558    | 0.6100 | 0.4200 |
| 0.6209        | 6.0   | 792  | 0.4684          | 0.3989   | 0.4640    | 0.6139 | 0.4275 |
| 0.6209        | 7.0   | 924  | 0.4813          | 0.3994   | 0.4673    | 0.6276 | 0.4376 |
| 0.2738        | 8.0   | 1056 | 0.4890          | 0.4015   | 0.4760    | 0.6404 | 0.4470 |
| 0.2738        | 9.0   | 1188 | 0.5157          | 0.4013   | 0.4723    | 0.6386 | 0.4432 |
| 0.2738        | 10.0  | 1320 | 0.5229          | 0.4019   | 0.4665    | 0.6498 | 0.4433 |
| 0.2738        | 11.0  | 1452 | 0.5188          | 0.4024   | 0.4673    | 0.6529 | 0.4448 |
| 0.1745        | 12.0  | 1584 | 0.5231          | 0.4029   | 0.4730    | 0.6519 | 0.4491 |
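
The per-epoch metrics above are consistent with a seqeval-style evaluation for token classification. A sketch of such a `compute_metrics` function follows; the label list is an assumption, as the actual tag set is not documented in this card.

```python
# Hedged sketch of a seqeval-based compute_metrics function that would produce
# the accuracy / precision / recall / F1 columns above.
import numpy as np
import evaluate

seqeval = evaluate.load("seqeval")
label_list = ["O", "B-A1", "I-A1", "B-B1", "I-B1", "B-C1", "I-C1"]  # assumed CEFR-style tag set

def compute_metrics(eval_pred):
    logits, labels = eval_pred
    predictions = np.argmax(logits, axis=-1)

    # Trainer marks padded/special positions with label id -100; drop them before scoring.
    true_labels = [
        [label_list[l] for l in label_row if l != -100]
        for label_row in labels
    ]
    true_predictions = [
        [label_list[p] for p, l in zip(pred_row, label_row) if l != -100]
        for pred_row, label_row in zip(predictions, labels)
    ]

    results = seqeval.compute(predictions=true_predictions, references=true_labels)
    return {
        "accuracy": results["overall_accuracy"],
        "precision": results["overall_precision"],
        "recall": results["overall_recall"],
        "f1": results["overall_f1"],
    }
```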
### Framework versions
- Transformers 4.41.1
- Pytorch 2.3.0+cu121
- Datasets 2.19.1
- Tokenizers 0.19.1
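
An optional sketch for checking a local environment against these pins:

```python
# Print installed versions next to the versions listed above.
import transformers, torch, datasets, tokenizers

for name, module, expected in [
    ("Transformers", transformers, "4.41.1"),
    ("PyTorch", torch, "2.3.0+cu121"),
    ("Datasets", datasets, "2.19.1"),
    ("Tokenizers", tokenizers, "0.19.1"),
]:
    print(f"{name}: installed {module.__version__}, card used {expected}")
```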