
hkpr_cricket_001

This model is a fine-tuned version of samhitmantrala/cricket3 on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 2.6178

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-06
  • train_batch_size: 32
  • eval_batch_size: 32
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 50
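The `linear` scheduler above decays the learning rate from its initial value down to zero over the total number of optimizer steps. A minimal sketch of that decay (assuming zero warmup steps, which the hyperparameters do not state):

```python
# Linear learning-rate decay, as implied by lr_scheduler_type: linear.
# Assumption: no warmup phase (warmup steps are not listed above).

def linear_lr(step, total_steps, initial_lr=2e-06):
    """Learning rate after `step` optimizer updates, decaying linearly to 0."""
    remaining = max(0.0, 1.0 - step / total_steps)
    return initial_lr * remaining

# With 50 epochs at 1 step per epoch (see the results table), total_steps = 50:
# step 0 -> 2e-06, step 25 -> 1e-06, step 50 -> 0.0
```

With only 50 total steps and a small initial rate of 2e-06, the effective learning rate stays tiny throughout, which is consistent with the very slow loss decrease in the results table below.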

Training results

Training Loss    Epoch    Step    Validation Loss
No log           1.0      1       2.6288
No log           2.0      2       2.6279
No log           3.0      3       2.6272
No log           4.0      4       2.6264
No log           5.0      5       2.6258
No log           6.0      6       2.6251
No log           7.0      7       2.6245
No log           8.0      8       2.6240
No log           9.0      9       2.6236
No log           10.0     10      2.6231
No log           11.0     11      2.6226
No log           12.0     12      2.6222
No log           13.0     13      2.6217
No log           14.0     14      2.6214
No log           15.0     15      2.6211
No log           16.0     16      2.6210
No log           17.0     17      2.6207
No log           18.0     18      2.6206
No log           19.0     19      2.6204
No log           20.0     20      2.6202
No log           21.0     21      2.6200
No log           22.0     22      2.6198
No log           23.0     23      2.6195
No log           24.0     24      2.6193
No log           25.0     25      2.6190
No log           26.0     26      2.6189
No log           27.0     27      2.6188
No log           28.0     28      2.6186
No log           29.0     29      2.6185
No log           30.0     30      2.6185
No log           31.0     31      2.6183
No log           32.0     32      2.6182
No log           33.0     33      2.6182
No log           34.0     34      2.6181
No log           35.0     35      2.6180
No log           36.0     36      2.6180
No log           37.0     37      2.6180
No log           38.0     38      2.6180
No log           39.0     39      2.6180
No log           40.0     40      2.6180
No log           41.0     41      2.6180
No log           42.0     42      2.6179
No log           43.0     43      2.6179
No log           44.0     44      2.6179
No log           45.0     45      2.6179
No log           46.0     46      2.6179
No log           47.0     47      2.6178
No log           48.0     48      2.6178
No log           49.0     49      2.6178
No log           50.0     50      2.6178

Framework versions

  • Transformers 4.38.2
  • Pytorch 2.2.1+cu121
  • Datasets 2.18.0
  • Tokenizers 0.15.2
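
To reproduce this environment, the versions above can be pinned in a requirements file (a sketch; the `+cu121` local tag on the PyTorch version comes from the CUDA 12.1 wheel index and is dropped when pinning from PyPI):

```text
transformers==4.38.2
torch==2.2.1
datasets==2.18.0
tokenizers==0.15.2
```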

Model size

  • 81.9M parameters (Safetensors, F32 tensors)