
Bert_uncased_fine_tuned_Reward_Model

This model is a fine-tuned version of bert-base-uncased on the poem_sentiment dataset. It achieves the following results on the evaluation set:

  • Loss: 0.0876
  • MSE: 0.0876
  • MAE: 0.1403
  • R2: 0.7389
  • Accuracy: 0.875

Model description

More information needed

Intended uses & limitations

More information needed
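
The author provides no usage guidance; the snippet below is a minimal, hedged inference sketch. It assumes the checkpoint is loadable with AutoModelForSequenceClassification and exposes a single-output regression head (consistent with the MSE/MAE/R2 metrics reported above); the example verse is arbitrary.

```python
# Minimal inference sketch. Assumption: the checkpoint uses a single-output
# regression head, consistent with the MSE/MAE/R2 metrics reported above.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_id = "AliChazz/Bert_uncased_fine_tuned_Reward_Model"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

text = "the quiet woods lay silver in the moonlight"  # arbitrary example verse
inputs = tokenizer(text, return_tensors="pt", truncation=True)
with torch.no_grad():
    logits = model(**inputs).logits

# With a regression head, logits has shape (1, 1); the scalar is the predicted score.
score = logits.squeeze().item()
print(f"Predicted reward score: {score:.4f}")
```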

Training and evaluation data

More information needed
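
The card does not describe the splits or preprocessing used; the sketch below only shows how to load the poem_sentiment dataset named in the summary. The field names mentioned in the comments come from the public dataset and are an assumption about the exact inputs used for fine-tuning.

```python
# Hedged sketch: load the poem_sentiment dataset referenced above.
# The exact preprocessing (e.g. how labels were mapped to regression targets)
# is not documented in the card and is not reproduced here.
from datasets import load_dataset

dataset = load_dataset("poem_sentiment")
print(dataset)              # train / validation / test splits
print(dataset["train"][0])  # fields include 'verse_text' and 'label'
```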

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 16
  • eval_batch_size: 16
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 20
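
As a rough reference, the listed hyperparameters map onto a Hugging Face TrainingArguments configuration as sketched below. The output directory and the per-epoch evaluation strategy are assumptions, and the Adam settings shown are simply the Trainer defaults that match the values above.

```python
# Hedged sketch: TrainingArguments reconstructed from the hyperparameters above.
# output_dir and evaluation_strategy are assumptions, not taken from the card.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="bert_uncased_fine_tuned_reward_model",  # assumed
    learning_rate=2e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=20,
    evaluation_strategy="epoch",  # assumed: the table below reports metrics per epoch
)
```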

Training results

| Training Loss | Epoch | Step | Validation Loss | MSE    | MAE    | R2     | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|:------:|:--------:|
| No log        | 1.0   | 53   | 0.1744          | 0.1744 | 0.2973 | 0.4805 | 0.7885   |
| No log        | 2.0   | 106  | 0.1074          | 0.1074 | 0.2333 | 0.6801 | 0.8846   |
| No log        | 3.0   | 159  | 0.1026          | 0.1026 | 0.2134 | 0.6943 | 0.8654   |
| No log        | 4.0   | 212  | 0.0877          | 0.0877 | 0.1841 | 0.7388 | 0.8942   |
| No log        | 5.0   | 265  | 0.1000          | 0.1000 | 0.2007 | 0.7021 | 0.8942   |
| No log        | 6.0   | 318  | 0.0863          | 0.0863 | 0.1738 | 0.7429 | 0.8942   |
| No log        | 7.0   | 371  | 0.0966          | 0.0966 | 0.1827 | 0.7122 | 0.8846   |
| No log        | 8.0   | 424  | 0.0946          | 0.0946 | 0.1701 | 0.7183 | 0.8846   |
| No log        | 9.0   | 477  | 0.0978          | 0.0978 | 0.1658 | 0.7088 | 0.875    |
| 0.0516        | 10.0  | 530  | 0.0854          | 0.0854 | 0.1639 | 0.7457 | 0.875    |
| 0.0516        | 11.0  | 583  | 0.0947          | 0.0947 | 0.1620 | 0.7181 | 0.8846   |
| 0.0516        | 12.0  | 636  | 0.0907          | 0.0907 | 0.1516 | 0.7297 | 0.8846   |
| 0.0516        | 13.0  | 689  | 0.0885          | 0.0885 | 0.1546 | 0.7364 | 0.875    |
| 0.0516        | 14.0  | 742  | 0.0849          | 0.0849 | 0.1452 | 0.7471 | 0.8942   |
| 0.0516        | 15.0  | 795  | 0.0823          | 0.0823 | 0.1428 | 0.7548 | 0.8846   |
| 0.0516        | 16.0  | 848  | 0.0864          | 0.0864 | 0.1429 | 0.7427 | 0.8846   |
| 0.0516        | 17.0  | 901  | 0.0854          | 0.0854 | 0.1427 | 0.7457 | 0.8846   |
| 0.0516        | 18.0  | 954  | 0.0860          | 0.0860 | 0.1429 | 0.7437 | 0.875    |
| 0.0059        | 19.0  | 1007 | 0.0871          | 0.0871 | 0.1438 | 0.7406 | 0.875    |
| 0.0059        | 20.0  | 1060 | 0.0876          | 0.0876 | 0.1403 | 0.7389 | 0.875    |

Framework versions

  • Transformers 4.26.1
  • Pytorch 1.13.1+cu116
  • Datasets 2.10.1
  • Tokenizers 0.13.2