---
library_name: transformers
license: apache-2.0
base_model: indolem/indobertweet-base-uncased
tags:
- generated_from_trainer
metrics:
- accuracy
- precision
- recall
- f1
model-index:
- name: PonXXI
  results: []
---
# PonXXI
This model is a fine-tuned version of [indolem/indobertweet-base-uncased](https://huggingface.co/indolem/indobertweet-base-uncased) on an unknown dataset. It achieves the following results on the evaluation set:
- Loss: 1.3259
- Accuracy: 0.7457
- Precision: 0.7431
- Recall: 0.7429
- F1: 0.7422
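The card reports a single precision, recall, and F1 value each, so these are aggregate scores; the averaging mode is not stated. A minimal, dependency-free sketch of how such metrics are typically computed, assuming weighted averaging (the labels below are hypothetical, since the evaluation data is not disclosed):

```python
from collections import Counter

def classification_metrics(y_true, y_pred):
    """Accuracy plus weighted-average precision, recall, and F1.

    Weighted averaging is an assumption; the card does not state
    which averaging mode produced its reported numbers.
    """
    labels = sorted(set(y_true))
    support = Counter(y_true)          # true examples per class
    n = len(y_true)
    accuracy = sum(t == p for t, p in zip(y_true, y_pred)) / n

    precision = recall = f1 = 0.0
    for c in labels:
        tp = sum(1 for t, p in zip(y_true, y_pred) if t == c and p == c)
        pred_c = sum(1 for p in y_pred if p == c)
        true_c = support[c]
        p_c = tp / pred_c if pred_c else 0.0
        r_c = tp / true_c if true_c else 0.0
        f_c = 2 * p_c * r_c / (p_c + r_c) if (p_c + r_c) else 0.0
        w = true_c / n                 # weight each class by its support
        precision += w * p_c
        recall += w * r_c
        f1 += w * f_c
    return accuracy, precision, recall, f1

# Hypothetical predictions for illustration only.
y_true = [0, 1, 2, 1, 0, 2, 1, 0]
y_pred = [0, 1, 2, 0, 0, 2, 1, 1]
acc, prec, rec, f1 = classification_metrics(y_true, y_pred)
print(f"acc={acc:.4f} p={prec:.4f} r={rec:.4f} f1={f1:.4f}")
```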
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10
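With a `linear` scheduler and no warmup listed, the learning rate decays linearly from 1e-4 to zero over the full run (10 epochs × 214 steps per epoch = 2140 steps, matching the Step column in the results table). A sketch of that schedule in plain Python, assuming zero warmup steps since the card lists none:

```python
def linear_lr(step, total_steps=2140, base_lr=1e-4, warmup_steps=0):
    """Linear schedule: ramp up over warmup, then decay to zero.

    total_steps=2140 comes from this card's training log (10 epochs
    x 214 steps); warmup_steps=0 is an assumption, as none is listed.
    """
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)
    remaining = max(0, total_steps - step)
    return base_lr * remaining / max(1, total_steps - warmup_steps)

print(linear_lr(0))      # full base LR at the start
print(linear_lr(1070))   # half the base LR at the midpoint
print(linear_lr(2140))   # 0.0 at the final step
```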
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Precision | Recall | F1     |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:---------:|:------:|:------:|
| 0.8207        | 1.0   | 214  | 0.7313          | 0.6928   | 0.7083    | 0.6928 | 0.6937 |
| 0.6159        | 2.0   | 428  | 0.6846          | 0.7270   | 0.7421    | 0.7305 | 0.7250 |
| 0.4666        | 3.0   | 642  | 0.7258          | 0.7270   | 0.7282    | 0.7243 | 0.7221 |
| 0.349         | 4.0   | 856  | 0.8328          | 0.7406   | 0.7403    | 0.7368 | 0.7356 |
| 0.2752        | 5.0   | 1070 | 0.8500          | 0.7406   | 0.7377    | 0.7387 | 0.7379 |
| 0.224         | 6.0   | 1284 | 1.0037          | 0.7457   | 0.7425    | 0.7435 | 0.7424 |
| 0.1883        | 7.0   | 1498 | 1.1039          | 0.7457   | 0.7446    | 0.7435 | 0.7437 |
| 0.1518        | 8.0   | 1712 | 1.1535          | 0.7457   | 0.7439    | 0.7426 | 0.7420 |
| 0.1372        | 9.0   | 1926 | 1.3070          | 0.7304   | 0.7273    | 0.7262 | 0.7241 |
| 0.1195        | 10.0  | 2140 | 1.3259          | 0.7457   | 0.7431    | 0.7429 | 0.7422 |
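Validation loss bottoms out at epoch 2 (0.6846) and rises steadily afterwards while training loss keeps falling, a typical overfitting pattern; the headline metrics above are from the final (epoch 10) checkpoint. A sketch of picking a best epoch from the logged results (with Transformers' `load_best_model_at_end` option, a checkpoint chosen this way would have been restored instead):

```python
# (epoch, validation_loss, accuracy) triples copied from the table above.
results = [
    (1, 0.7313, 0.6928), (2, 0.6846, 0.7270), (3, 0.7258, 0.7270),
    (4, 0.8328, 0.7406), (5, 0.8500, 0.7406), (6, 1.0037, 0.7457),
    (7, 1.1039, 0.7457), (8, 1.1535, 0.7457), (9, 1.3070, 0.7304),
    (10, 1.3259, 0.7457),
]

# Lowest validation loss favours epoch 2; highest accuracy first
# occurs at epoch 6 (max() keeps the first of the tied epochs).
best_by_loss = min(results, key=lambda r: r[1])
best_by_accuracy = max(results, key=lambda r: r[2])
print("best by loss:", best_by_loss)
print("best by accuracy:", best_by_accuracy)
```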
### Framework versions
- Transformers 4.44.2
- Pytorch 2.4.1+cu121
- Datasets 3.0.1
- Tokenizers 0.19.1