
hBERTv1_new_pretrain_48_emb_com_stsb

This model is a fine-tuned version of gokuls/bert_12_layer_model_v1_complete_training_new_emb_compress_48 on the GLUE STSB dataset. It achieves the following results on the evaluation set:

  • Loss: 1.9214
  • Pearson: 0.4648
  • Spearmanr: 0.4600
  • Combined Score: 0.4624

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 4e-05
  • train_batch_size: 128
  • eval_batch_size: 128
  • seed: 10
  • distributed_type: multi-GPU
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 50
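With a linear scheduler, the learning rate decays from 4e-05 toward zero over the scheduled steps (45 optimizer steps per epoch per the training results, times 50 epochs). A small pure-Python sketch of that decay, assuming zero warmup steps (the card does not record any warmup):

```python
# Total scheduled steps: 45 steps/epoch (from the training results) * 50 epochs.
TOTAL_STEPS = 45 * 50

def linear_lr(step, base_lr=4e-05, total_steps=TOTAL_STEPS):
    """Linear decay from base_lr at step 0 to 0 at total_steps (no warmup)."""
    return base_lr * max(0.0, 1.0 - step / total_steps)

print(linear_lr(0))     # base learning rate at the first step
print(linear_lr(1125))  # halfway through the schedule
```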

Training results

| Training Loss | Epoch | Step | Validation Loss | Pearson | Spearmanr | Combined Score |
|---------------|-------|------|-----------------|---------|-----------|----------------|
| 2.5817        | 1.0   | 45   | 2.6028          | 0.2027  | 0.1896    | 0.1962         |
| 2.1023        | 2.0   | 90   | 2.1596          | 0.2035  | 0.1938    | 0.1986         |
| 1.9567        | 3.0   | 135  | 2.3409          | 0.1855  | 0.1931    | 0.1893         |
| 1.7201        | 4.0   | 180  | 2.1790          | 0.2865  | 0.2934    | 0.2899         |
| 1.5153        | 5.0   | 225  | 2.1208          | 0.3381  | 0.3352    | 0.3367         |
| 1.2674        | 6.0   | 270  | 2.1224          | 0.3882  | 0.3898    | 0.3890         |
| 1.0115        | 7.0   | 315  | 2.2253          | 0.4304  | 0.4281    | 0.4293         |
| 0.7449        | 8.0   | 360  | 2.3235          | 0.4236  | 0.4323    | 0.4279         |
| 0.66          | 9.0   | 405  | 2.3617          | 0.4340  | 0.4351    | 0.4346         |
| 0.4678        | 10.0  | 450  | 2.0741          | 0.4300  | 0.4258    | 0.4279         |
| 0.4438        | 11.0  | 495  | 2.3816          | 0.4285  | 0.4294    | 0.4289         |
| 0.3192        | 12.0  | 540  | 2.1673          | 0.4580  | 0.4602    | 0.4591         |
| 0.2481        | 13.0  | 585  | 2.1544          | 0.4392  | 0.4357    | 0.4374         |
| 0.2296        | 14.0  | 630  | 2.0075          | 0.4603  | 0.4582    | 0.4593         |
| 0.1765        | 15.0  | 675  | 2.1395          | 0.4624  | 0.4617    | 0.4621         |
| 0.1533        | 16.0  | 720  | 2.2715          | 0.4512  | 0.4427    | 0.4469         |
| 0.1343        | 17.0  | 765  | 2.1726          | 0.4441  | 0.4417    | 0.4429         |
| 0.1373        | 18.0  | 810  | 2.0223          | 0.4532  | 0.4424    | 0.4478         |
| 0.1277        | 19.0  | 855  | 1.9992          | 0.4395  | 0.4299    | 0.4347         |
| 0.0968        | 20.0  | 900  | 2.1078          | 0.4620  | 0.4601    | 0.4610         |
| 0.084         | 21.0  | 945  | 2.0684          | 0.4627  | 0.4577    | 0.4602         |
| 0.0777        | 22.0  | 990  | 1.9214          | 0.4648  | 0.4600    | 0.4624         |
| 0.0572        | 23.0  | 1035 | 2.0636          | 0.4506  | 0.4422    | 0.4464         |
| 0.0615        | 24.0  | 1080 | 2.0404          | 0.4489  | 0.4388    | 0.4438         |
| 0.0516        | 25.0  | 1125 | 2.0599          | 0.4516  | 0.4435    | 0.4475         |
| 0.0501        | 26.0  | 1170 | 2.0359          | 0.4530  | 0.4489    | 0.4510         |
| 0.0515        | 27.0  | 1215 | 1.9571          | 0.4588  | 0.4508    | 0.4548         |
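The evaluation results reported at the top match the epoch-22 row, which has the lowest validation loss in the table. A small sketch of that selection (the dict literal simply copies the table's validation-loss column):

```python
# Validation loss per epoch, copied from the training results table above.
val_losses = {
    1: 2.6028, 2: 2.1596, 3: 2.3409, 4: 2.1790, 5: 2.1208,
    6: 2.1224, 7: 2.2253, 8: 2.3235, 9: 2.3617, 10: 2.0741,
    11: 2.3816, 12: 2.1673, 13: 2.1544, 14: 2.0075, 15: 2.1395,
    16: 2.2715, 17: 2.1726, 18: 2.0223, 19: 1.9992, 20: 2.1078,
    21: 2.0684, 22: 1.9214, 23: 2.0636, 24: 2.0404, 25: 2.0599,
    26: 2.0359, 27: 1.9571,
}
# Pick the epoch whose checkpoint has the lowest validation loss.
best_epoch = min(val_losses, key=val_losses.get)
print(best_epoch, val_losses[best_epoch])  # → 22 1.9214
```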

Framework versions

  • Transformers 4.30.2
  • Pytorch 1.14.0a0+410ce96
  • Datasets 2.12.0
  • Tokenizers 0.13.3