wdevinsp/indobert-base-uncased-finetuned-digestive-qna

This model is a fine-tuned version of indolem/indobert-base-uncased on an unknown dataset. It achieves the following results at the end of training (note that all reported metrics are training-set metrics; no held-out evaluation results are listed):

  • Train Loss: 0.0391
  • Train End Logits Accuracy: 0.9814
  • Train Start Logits Accuracy: 0.9814
  • Epoch: 37
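
The card does not include usage instructions. As a minimal sketch, the checkpoint can be loaded for extractive question answering through the standard transformers pipeline API; the Indonesian question and context below are hypothetical placeholders, not examples from the (unspecified) training data:

```python
# Minimal usage sketch. The pipeline task and arguments are standard
# transformers API; the question/context strings are illustrative placeholders.
from transformers import pipeline

qa = pipeline(
    "question-answering",
    model="wdevinsp/indobert-base-uncased-finetuned-digestive-qna",
    framework="tf",  # the checkpoint was trained with TensorFlow (see Framework versions)
)

result = qa(
    question="Apa penyebab umum sakit maag?",  # "What commonly causes gastritis?"
    context=(
        "Sakit maag dapat disebabkan oleh infeksi bakteri, penggunaan obat "
        "antiinflamasi jangka panjang, serta pola makan yang tidak teratur."
    ),
)
print(result)  # e.g. {'score': ..., 'start': ..., 'end': ..., 'answer': ...}
```

The pipeline returns the highest-scoring answer span extracted from the context, together with its character offsets and a confidence score.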

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • optimizer: Adam (beta_1: 0.9, beta_2: 0.999, epsilon: 1e-08, amsgrad: False, jit_compile: True; weight decay, gradient clipping, and EMA were disabled)
  • learning_rate: PolynomialDecay schedule (initial_learning_rate: 2e-05, decay_steps: 3700, end_learning_rate: 0.0, power: 1.0, cycle: False); a Keras sketch follows this list
  • training_precision: float32
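
For readers who want to reproduce the schedule, the serialized configuration above maps onto Keras roughly as follows. This is a sketch of the optimizer only; the actual model compilation and training loop are not part of this card:

```python
# Sketch reconstructing the serialized optimizer config above in Keras
# (TensorFlow 2.15). Only the optimizer is shown; the training loop is not.
import tensorflow as tf

# Linear (power=1.0) decay from 2e-05 down to 0.0 over 3700 steps.
lr_schedule = tf.keras.optimizers.schedules.PolynomialDecay(
    initial_learning_rate=2e-05,
    decay_steps=3700,
    end_learning_rate=0.0,
    power=1.0,
    cycle=False,
)

optimizer = tf.keras.optimizers.Adam(
    learning_rate=lr_schedule,
    beta_1=0.9,
    beta_2=0.999,
    epsilon=1e-08,
    amsgrad=False,
    jit_compile=True,  # matches 'jit_compile': True in the serialized config
)
```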

Training results

| Train Loss | Train End Logits Accuracy | Train Start Logits Accuracy | Epoch |
|:----------:|:-------------------------:|:---------------------------:|:-----:|
| 3.1872     | 0.3564                    | 0.3733                      | 0     |
| 2.1076     | 0.4189                    | 0.4088                      | 1     |
| 1.6782     | 0.4443                    | 0.5507                      | 2     |
| 1.1711     | 0.5507                    | 0.7111                      | 3     |
| 0.6231     | 0.7720                    | 0.8429                      | 4     |
| 0.4296     | 0.8463                    | 0.8902                      | 5     |
| 0.2544     | 0.9189                    | 0.9409                      | 6     |
| 0.1853     | 0.9358                    | 0.9510                      | 7     |
| 0.1411     | 0.9544                    | 0.9578                      | 8     |
| 0.1215     | 0.9493                    | 0.9662                      | 9     |
| 0.1141     | 0.9645                    | 0.9628                      | 10    |
| 0.1195     | 0.9561                    | 0.9696                      | 11    |
| 0.0851     | 0.9662                    | 0.9645                      | 12    |
| 0.1255     | 0.9527                    | 0.9493                      | 13    |
| 0.0897     | 0.9595                    | 0.9730                      | 14    |
| 0.0860     | 0.9578                    | 0.9696                      | 15    |
| 0.0710     | 0.9595                    | 0.9713                      | 16    |
| 0.0669     | 0.9628                    | 0.9713                      | 17    |
| 0.0634     | 0.9797                    | 0.9730                      | 18    |
| 0.0765     | 0.9662                    | 0.9814                      | 19    |
| 0.0732     | 0.9679                    | 0.9730                      | 20    |
| 0.0586     | 0.9696                    | 0.9713                      | 21    |
| 0.0572     | 0.9679                    | 0.9764                      | 22    |
| 0.0518     | 0.9747                    | 0.9764                      | 23    |
| 0.0500     | 0.9713                    | 0.9764                      | 24    |
| 0.0464     | 0.9696                    | 0.9713                      | 25    |
| 0.0470     | 0.9814                    | 0.9730                      | 26    |
| 0.0575     | 0.9797                    | 0.9764                      | 27    |
| 0.0660     | 0.9679                    | 0.9696                      | 28    |
| 0.0576     | 0.9713                    | 0.9696                      | 29    |
| 0.0515     | 0.9696                    | 0.9747                      | 30    |
| 0.0547     | 0.9780                    | 0.9696                      | 31    |
| 0.0430     | 0.9730                    | 0.9780                      | 32    |
| 0.0408     | 0.9814                    | 0.9747                      | 33    |
| 0.0475     | 0.9747                    | 0.9747                      | 34    |
| 0.0536     | 0.9764                    | 0.9797                      | 35    |
| 0.0494     | 0.9696                    | 0.9730                      | 36    |
| 0.0391     | 0.9814                    | 0.9814                      | 37    |

Framework versions

  • Transformers 4.41.2
  • TensorFlow 2.15.0
  • Datasets 2.20.0
  • Tokenizers 0.19.1