
ckt3

This model is a fine-tuned version of samhitmantrala/ckt2 on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 0.3129
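
The checkpoint can be loaded directly from the Hugging Face Hub with the transformers Auto classes. The snippet below is a minimal sketch, assuming the model is published as samhitmantrala/ckt3 and is a causal language model; neither the repository id nor the architecture is stated in this card, so adjust both if they differ.

```python
# Minimal loading/inference sketch (assumptions: repo id "samhitmantrala/ckt3",
# causal-LM architecture; swap the Auto class if the model head differs).
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "samhitmantrala/ckt3"  # assumed repository id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

inputs = tokenizer("Hello, world", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```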

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a sketch of the equivalent TrainingArguments follows the list):

  • learning_rate: 2e-05
  • train_batch_size: 32
  • eval_batch_size: 32
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 50
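
The hyperparameters above map onto transformers TrainingArguments as sketched below. This is for orientation only, not the author's training script; the per-epoch evaluation setting is inferred from the per-epoch validation losses reported in the results table.

```python
# Sketch: the listed hyperparameters expressed as transformers TrainingArguments.
# Dataset, model, and Trainer wiring are omitted because the card does not describe them.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="ckt3",                 # assumed output directory name
    learning_rate=2e-05,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=50,
    evaluation_strategy="epoch",       # inferred from the per-epoch validation losses
    adam_beta1=0.9,                    # Adam betas/epsilon as listed above
    adam_beta2=0.999,
    adam_epsilon=1e-08,
)
```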

Training results

| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:----:|:---------------:|
| No log        | 1.0   | 1    | 0.6850          |
| No log        | 2.0   | 2    | 0.6271          |
| No log        | 3.0   | 3    | 0.5934          |
| No log        | 4.0   | 4    | 0.5708          |
| No log        | 5.0   | 5    | 0.5513          |
| No log        | 6.0   | 6    | 0.5337          |
| No log        | 7.0   | 7    | 0.5165          |
| No log        | 8.0   | 8    | 0.5004          |
| No log        | 9.0   | 9    | 0.4860          |
| No log        | 10.0  | 10   | 0.4734          |
| No log        | 11.0  | 11   | 0.4634          |
| No log        | 12.0  | 12   | 0.4543          |
| No log        | 13.0  | 13   | 0.4460          |
| No log        | 14.0  | 14   | 0.4381          |
| No log        | 15.0  | 15   | 0.4299          |
| No log        | 16.0  | 16   | 0.4220          |
| No log        | 17.0  | 17   | 0.4141          |
| No log        | 18.0  | 18   | 0.4068          |
| No log        | 19.0  | 19   | 0.4000          |
| No log        | 20.0  | 20   | 0.3933          |
| No log        | 21.0  | 21   | 0.3871          |
| No log        | 22.0  | 22   | 0.3817          |
| No log        | 23.0  | 23   | 0.3766          |
| No log        | 24.0  | 24   | 0.3718          |
| No log        | 25.0  | 25   | 0.3672          |
| No log        | 26.0  | 26   | 0.3625          |
| No log        | 27.0  | 27   | 0.3581          |
| No log        | 28.0  | 28   | 0.3537          |
| No log        | 29.0  | 29   | 0.3493          |
| No log        | 30.0  | 30   | 0.3453          |
| No log        | 31.0  | 31   | 0.3414          |
| No log        | 32.0  | 32   | 0.3380          |
| No log        | 33.0  | 33   | 0.3348          |
| No log        | 34.0  | 34   | 0.3317          |
| No log        | 35.0  | 35   | 0.3288          |
| No log        | 36.0  | 36   | 0.3262          |
| No log        | 37.0  | 37   | 0.3241          |
| No log        | 38.0  | 38   | 0.3221          |
| No log        | 39.0  | 39   | 0.3205          |
| No log        | 40.0  | 40   | 0.3191          |
| No log        | 41.0  | 41   | 0.3179          |
| No log        | 42.0  | 42   | 0.3167          |
| No log        | 43.0  | 43   | 0.3158          |
| No log        | 44.0  | 44   | 0.3150          |
| No log        | 45.0  | 45   | 0.3143          |
| No log        | 46.0  | 46   | 0.3138          |
| No log        | 47.0  | 47   | 0.3134          |
| No log        | 48.0  | 48   | 0.3132          |
| No log        | 49.0  | 49   | 0.3130          |
| No log        | 50.0  | 50   | 0.3129          |

Framework versions

  • Transformers 4.38.2
  • Pytorch 2.2.1+cu121
  • Datasets 2.18.0
  • Tokenizers 0.15.2