
shahajbert_nwp_finetuning_def_v1

This model is a fine-tuned version of neuropark/sahajBERT on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 3.3686

Model description

More information needed

Intended uses & limitations

More information needed
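
The card gives no usage details, but the model can presumably be loaded like any Transformers checkpoint. A minimal sketch, assuming the repo id from this card's title and that the model supports the fill-mask task of its masked-language-model base (the "nwp" in the name suggests next-word/masked-token prediction on Bengali text):

```python
# Hypothetical usage sketch -- the fill-mask task and mask token are
# assumptions based on the base model's architecture, not stated in the card.
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="Hamza11/shahajbert_nwp_finetuning_def_v1")
predictions = fill_mask("আমি ভাত [MASK]")  # top candidates for the masked token
for p in predictions:
    print(p["token_str"], p["score"])
```

Running this downloads the checkpoint from the Hugging Face Hub on first use.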

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 50
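
As an illustration (not part of the original card), a linear scheduler with these settings decays the learning rate from 2e-05 toward zero over the run. A minimal sketch, assuming no warmup and 64,700 total optimizer steps (50 epochs × 1,294 steps per epoch, per the results table below):

```python
# Sketch of a no-warmup linear learning-rate schedule with the settings above.
# TOTAL_STEPS is inferred from the results table (50 epochs x 1294 steps/epoch).
TOTAL_STEPS = 64700
INITIAL_LR = 2e-05

def linear_lr(step: int, total_steps: int = TOTAL_STEPS, lr0: float = INITIAL_LR) -> float:
    """Learning rate after `step` optimizer steps, decayed linearly to zero."""
    return lr0 * max(0.0, 1.0 - step / total_steps)

print(linear_lr(0))      # initial learning rate
print(linear_lr(32350))  # halfway through training: half the initial LR
print(linear_lr(64700))  # final step: decayed to zero
```

With train_batch_size of 8 and 1,294 steps per epoch, the training split would contain roughly 8 × 1294 = 10,352 examples (assuming no gradient accumulation).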

Training results

Training Loss   Epoch   Step    Validation Loss
3.4643          1.0     1294    3.4986
3.3773          2.0     2588    3.4239
3.3263          3.0     3882    3.4514
3.2530          4.0     5176    3.4217
3.1767          5.0     6470    3.3953
3.1382          6.0     7764    3.3723
3.0959          7.0     9058    3.3533
3.0717          8.0     10352   3.3841
3.0472          9.0     11646   3.3747
3.0030          10.0    12940   3.4074
2.9259          11.0    14234   3.3752
2.9641          12.0    15528   3.4120
2.9016          13.0    16822   3.3769
2.9102          14.0    18116   3.4494
2.8359          15.0    19410   3.3717
2.8327          16.0    20704   3.3801
2.7914          17.0    21998   3.4244
2.7760          18.0    23292   3.3533
2.7258          19.0    24586   3.3669
2.7304          20.0    25880   3.3113
2.6619          21.0    27174   3.3531
2.6721          22.0    28468   3.3728
2.6387          23.0    29762   3.4085
2.6302          24.0    31056   3.3909
2.5952          25.0    32350   3.3288
2.5854          26.0    33644   3.3465
2.5597          27.0    34938   3.3854
2.5667          28.0    36232   3.3977
2.5189          29.0    37526   3.4447
2.5253          30.0    38820   3.4329
2.5013          31.0    40114   3.3689
2.4165          32.0    41408   3.4354
2.4414          33.0    42702   3.4247
2.4435          34.0    43996   3.4555
2.3846          35.0    45290   3.3626
2.3611          36.0    46584   3.4496
2.3753          37.0    47878   3.4788
2.3526          38.0    49172   3.4418
2.3543          39.0    50466   3.4803
2.3279          40.0    51760   3.4240
2.3388          41.0    53054   3.4378
2.3182          42.0    54348   3.4351
2.3118          43.0    55642   3.4245
2.3033          44.0    56936   3.4708
2.2757          45.0    58230   3.4062
2.2733          46.0    59524   3.5356
2.2746          47.0    60818   3.4078
2.2683          48.0    62112   3.4636
2.2498          49.0    63406   3.4752
2.2418          50.0    64700   3.4336
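
Worth noting (an observation on the table, not a claim from the card): the validation loss reaches its minimum at epoch 20 and drifts upward afterwards while the training loss keeps falling, which suggests the model starts overfitting well before epoch 50. A quick check, with the dictionary below simply restating the Validation Loss column:

```python
# Validation loss per epoch, copied verbatim from the results table above.
val_loss = {
    1: 3.4986, 2: 3.4239, 3: 3.4514, 4: 3.4217, 5: 3.3953,
    6: 3.3723, 7: 3.3533, 8: 3.3841, 9: 3.3747, 10: 3.4074,
    11: 3.3752, 12: 3.4120, 13: 3.3769, 14: 3.4494, 15: 3.3717,
    16: 3.3801, 17: 3.4244, 18: 3.3533, 19: 3.3669, 20: 3.3113,
    21: 3.3531, 22: 3.3728, 23: 3.4085, 24: 3.3909, 25: 3.3288,
    26: 3.3465, 27: 3.3854, 28: 3.3977, 29: 3.4447, 30: 3.4329,
    31: 3.3689, 32: 3.4354, 33: 3.4247, 34: 3.4555, 35: 3.3626,
    36: 3.4496, 37: 3.4788, 38: 3.4418, 39: 3.4803, 40: 3.4240,
    41: 3.4378, 42: 3.4351, 43: 3.4245, 44: 3.4708, 45: 3.4062,
    46: 3.5356, 47: 3.4078, 48: 3.4636, 49: 3.4752, 50: 3.4336,
}

# Find the epoch with the lowest validation loss.
best_epoch = min(val_loss, key=val_loss.get)
print(best_epoch, val_loss[best_epoch])  # epoch 20, loss 3.3113
```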

Framework versions

  • Transformers 4.38.1
  • Pytorch 2.1.0+cu121
  • Datasets 2.17.1
  • Tokenizers 0.15.2
Model size: 17.1M params (Safetensors, F32)

Model tree: Hamza11/shahajbert_nwp_finetuning_def_v1, fine-tuned from neuropark/sahajBERT.