PhoBert_content_256

This model is a fine-tuned version of vinai/phobert-base-v2 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.5561
  • Accuracy: 0.8962
  • F1: 0.8869
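
The card does not say how the F1 score is averaged (binary, macro, or weighted). As a minimal, hypothetical sketch of how the two metrics are computed from predictions, assuming a binary task and F1 over the positive class:

```python
def accuracy(y_true, y_pred):
    """Fraction of predictions that match the reference labels."""
    return sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)

def binary_f1(y_true, y_pred, positive=1):
    """F1 = harmonic mean of precision and recall for the positive class."""
    tp = sum(t == positive and p == positive for t, p in zip(y_true, y_pred))
    fp = sum(t != positive and p == positive for t, p in zip(y_true, y_pred))
    fn = sum(t == positive and p != positive for t, p in zip(y_true, y_pred))
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    return 2 * precision * recall / (precision + recall) if precision + recall else 0.0

labels = [1, 0, 1, 1, 0, 1]
preds  = [1, 0, 0, 1, 1, 1]
print(round(accuracy(labels, preds), 4))  # 0.6667
print(binary_f1(labels, preds))           # 0.75
```

For a multi-class task, macro-F1 is obtained by computing `binary_f1` once per class and averaging the results.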

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 64
  • eval_batch_size: 64
  • seed: 42
  • optimizer: AdamW (adamw_torch) with betas=(0.9, 0.999), epsilon=1e-08, and no additional optimizer arguments
  • lr_scheduler_type: linear
  • num_epochs: 10
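
With lr_scheduler_type: linear and no warmup listed, the learning rate decays linearly from 2e-05 to zero over training. A minimal sketch of that schedule (the helper name is hypothetical, and the total step count of ~4280 is read off the training log):

```python
def linear_lr(step, total_steps, base_lr=2e-5, warmup_steps=0):
    """Linear warmup (if any) followed by linear decay to zero,
    matching the shape of the 'linear' schedule."""
    if step < warmup_steps:
        return base_lr * (step / max(1, warmup_steps))
    remaining = max(0, total_steps - step)
    return base_lr * (remaining / max(1, total_steps - warmup_steps))

total = 4280  # approximate final optimizer step in the training log
print(linear_lr(0, total))     # 2e-05 (start)
print(linear_lr(2140, total))  # 1e-05 (halfway)
print(linear_lr(4280, total))  # 0.0 (end)
```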

Training results

| Training Loss | Epoch  | Step | Validation Loss | Accuracy | F1     |
|:-------------:|:------:|:----:|:---------------:|:--------:|:------:|
| No log        | 0.4673 | 200  | 0.3165          | 0.8774   | 0.8648 |
| No log        | 0.9346 | 400  | 0.2647          | 0.8947   | 0.8875 |
| 0.3125        | 1.4019 | 600  | 0.2729          | 0.8947   | 0.8861 |
| 0.3125        | 1.8692 | 800  | 0.2917          | 0.8981   | 0.8885 |
| 0.2303        | 2.3364 | 1000 | 0.2730          | 0.8976   | 0.8893 |
| 0.2303        | 2.8037 | 1200 | 0.2565          | 0.9026   | 0.8958 |
| 0.1856        | 3.2710 | 1400 | 0.3230          | 0.8929   | 0.8831 |
| 0.1856        | 3.7383 | 1600 | 0.3062          | 0.8988   | 0.8901 |
| 0.1430        | 4.2056 | 1800 | 0.3220          | 0.8997   | 0.8926 |
| 0.1430        | 4.6729 | 2000 | 0.3811          | 0.8926   | 0.8819 |
| 0.1124        | 5.1402 | 2200 | 0.4578          | 0.8900   | 0.8785 |
| 0.1124        | 5.6075 | 2400 | 0.4192          | 0.8935   | 0.8835 |
| 0.0889        | 6.0748 | 2600 | 0.4115          | 0.8986   | 0.8910 |
| 0.0889        | 6.5421 | 2800 | 0.4469          | 0.8959   | 0.8883 |
| 0.0720        | 7.0093 | 3000 | 0.4771          | 0.8912   | 0.8804 |
| 0.0720        | 7.4766 | 3200 | 0.4829          | 0.8947   | 0.8855 |
| 0.0720        | 7.9439 | 3400 | 0.4922          | 0.8943   | 0.8850 |
| 0.0559        | 8.4112 | 3600 | 0.5391          | 0.8941   | 0.8840 |
| 0.0559        | 8.8785 | 3800 | 0.5356          | 0.8941   | 0.8848 |
| 0.0451        | 9.3458 | 4000 | 0.5583          | 0.8943   | 0.8848 |
| 0.0451        | 9.8131 | 4200 | 0.5561          | 0.8962   | 0.8869 |
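
Evaluation ran every 200 steps, and the first logged point pairs step 200 with epoch 0.4673, implying roughly 428 optimizer steps per epoch; at a train_batch_size of 64, that suggests a training set of about 27k examples. This is an inference from the log, not a figure stated on the card:

```python
steps, epoch_fraction = 200, 0.4673  # first logged evaluation point
batch_size = 64                      # train_batch_size from the hyperparameters

steps_per_epoch = steps / epoch_fraction
approx_train_examples = steps_per_epoch * batch_size
print(round(steps_per_epoch))        # 428
print(round(approx_train_examples))  # 27391 (rough estimate)
```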

Framework versions

  • Transformers 4.51.1
  • Pytorch 2.5.1+cu121
  • Datasets 3.3.1
  • Tokenizers 0.21.0
Model size: 135M params (F32, Safetensors)

Model tree for RonTon05/PhoBert_content_256

Finetuned
(213)
this model