---
tags:
  - generated_from_trainer
metrics:
  - accuracy
model-index:
  - name: CommitPredictor
    results: []
---

# CommitPredictor

This model is a fine-tuned version of [microsoft/codebert-base-mlm](https://huggingface.co/microsoft/codebert-base-mlm) on an unspecified dataset. It achieves the following results on the evaluation set:

- Loss: 1.9935
- Accuracy: 0.6325
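
Since the base model is a masked-language model, a fill-mask pipeline is the most direct way to try the checkpoint. This is a minimal sketch; the repository id `mamiksik/CommitPredictor` and the example input are assumptions inferred from this card rather than part of the original documentation.

```python
from transformers import pipeline

# Assumption: the checkpoint is published on the Hugging Face Hub under this id.
fill_mask = pipeline("fill-mask", model="mamiksik/CommitPredictor")

# CodeBERT uses RoBERTa-style tokenization, so the mask token is <mask>.
for prediction in fill_mask("Fix <mask> pointer dereference in the parser"):
    print(f'{prediction["token_str"]!r}: {prediction["score"]:.3f}')
```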

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 2e-05
- train_batch_size: 21
- eval_batch_size: 21
- seed: 42
- gradient_accumulation_steps: 3
- total_train_batch_size: 63
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 50
- mixed_precision_training: Native AMP
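
For reproducibility, the list above maps fairly directly onto the `Trainer` API. The following is a sketch under that assumption; the output directory and evaluation strategy are placeholders not taken from this card. Note that a per-device batch size of 21 with 3 gradient-accumulation steps yields the reported effective train batch size of 63.

```python
from transformers import TrainingArguments

# Sketch only: how the hyperparameters above could be expressed as TrainingArguments.
training_args = TrainingArguments(
    output_dir="commit-predictor",   # placeholder, not from the card
    learning_rate=2e-05,
    per_device_train_batch_size=21,
    per_device_eval_batch_size=21,
    seed=42,
    gradient_accumulation_steps=3,   # 21 * 3 = 63 effective train batch size
    lr_scheduler_type="linear",
    num_train_epochs=50,
    fp16=True,                       # mixed precision ("Native AMP")
    evaluation_strategy="epoch",     # assumption: matches the per-epoch log below
)
```

Adam with betas=(0.9, 0.999) and epsilon=1e-08 is the Trainer's default optimizer configuration, so it does not need to be set explicitly.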

### Training results

| Training Loss | Epoch | Step  | Validation Loss | Accuracy |
|:-------------:|:-----:|:-----:|:---------------:|:--------:|
| No log        | 1.0   | 448   | 2.4744          | 0.5376   |
| 2.9007        | 2.0   | 896   | 2.4149          | 0.5473   |
| 2.5284        | 3.0   | 1344  | 2.3077          | 0.5639   |
| 2.3292        | 4.0   | 1792  | 2.2617          | 0.5640   |
| 2.2692        | 5.0   | 2240  | 2.2155          | 0.5719   |
| 2.1766        | 6.0   | 2688  | 2.1555          | 0.5792   |
| 2.0842        | 7.0   | 3136  | 2.0758          | 0.6030   |
| 2.0268        | 8.0   | 3584  | 2.1446          | 0.5942   |
| 1.9416        | 9.0   | 4032  | 2.1110          | 0.5840   |
| 1.9416        | 10.0  | 4480  | 2.1379          | 0.5888   |
| 1.8969        | 11.0  | 4928  | 2.0461          | 0.6082   |
| 1.8247        | 12.0  | 5376  | 2.0585          | 0.6007   |
| 1.8038        | 13.0  | 5824  | 2.0541          | 0.6022   |
| 1.7601        | 14.0  | 6272  | 2.0832          | 0.6043   |
| 1.7086        | 15.0  | 6720  | 2.0224          | 0.6096   |
| 1.7087        | 16.0  | 7168  | 2.0853          | 0.6057   |
| 1.653         | 17.0  | 7616  | 2.0259          | 0.6124   |
| 1.5953        | 18.0  | 8064  | 1.9913          | 0.6207   |
| 1.6074        | 19.0  | 8512  | 1.9798          | 0.6157   |
| 1.6074        | 20.0  | 8960  | 2.0234          | 0.6033   |
| 1.5749        | 21.0  | 9408  | 1.9686          | 0.6197   |
| 1.535         | 22.0  | 9856  | 2.0068          | 0.6163   |
| 1.4942        | 23.0  | 10304 | 1.9486          | 0.6310   |
| 1.4765        | 24.0  | 10752 | 1.9502          | 0.6304   |
| 1.4558        | 25.0  | 11200 | 1.9509          | 0.6328   |
| 1.4617        | 26.0  | 11648 | 1.9903          | 0.6196   |
| 1.4224        | 27.0  | 12096 | 1.9849          | 0.6321   |
| 1.4019        | 28.0  | 12544 | 1.9781          | 0.6193   |
| 1.4019        | 29.0  | 12992 | 2.0661          | 0.6145   |
| 1.3624        | 30.0  | 13440 | 1.9948          | 0.6191   |
| 1.3517        | 31.0  | 13888 | 1.9117          | 0.6392   |
| 1.3613        | 32.0  | 14336 | 2.0300          | 0.6176   |
| 1.3428        | 33.0  | 14784 | 2.0005          | 0.6226   |
| 1.3257        | 34.0  | 15232 | 2.0079          | 0.6149   |
| 1.3127        | 35.0  | 15680 | 2.0231          | 0.6213   |
| 1.289         | 36.0  | 16128 | 1.9961          | 0.6296   |
| 1.2689        | 37.0  | 16576 | 1.9930          | 0.6221   |
| 1.2651        | 38.0  | 17024 | 1.9675          | 0.6314   |
| 1.2651        | 39.0  | 17472 | 1.9835          | 0.6220   |
| 1.2638        | 40.0  | 17920 | nan             | 0.6275   |
| 1.235         | 41.0  | 18368 | 2.0100          | 0.6299   |
| 1.2239        | 42.0  | 18816 | 2.0384          | 0.6152   |
| 1.2147        | 43.0  | 19264 | 2.0421          | 0.6209   |
| 1.1961        | 44.0  | 19712 | 2.0041          | 0.6212   |
| 1.1988        | 45.0  | 20160 | 1.9905          | 0.6230   |
| 1.2007        | 46.0  | 20608 | 2.0222          | 0.6275   |
| 1.2029        | 47.0  | 21056 | 1.9856          | 0.6361   |
| 1.1779        | 48.0  | 21504 | 2.0348          | 0.6184   |
| 1.1779        | 49.0  | 21952 | 1.9196          | 0.6324   |
| 1.1973        | 50.0  | 22400 | 1.9935          | 0.6325   |

### Framework versions

- Transformers 4.25.1
- Pytorch 1.13.0+cu117
- Datasets 2.7.1
- Tokenizers 0.13.2
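
A quick way to check that a local environment matches the versions above; the pinned versions come from this card, while the check itself is only a suggestion.

```python
import datasets
import tokenizers
import torch
import transformers

# Expected versions, as reported in this model card.
expected = {
    "transformers": "4.25.1",
    "torch": "1.13.0+cu117",
    "datasets": "2.7.1",
    "tokenizers": "0.13.2",
}

for name, module in [
    ("transformers", transformers),
    ("torch", torch),
    ("datasets", datasets),
    ("tokenizers", tokenizers),
]:
    print(f"{name}: installed {module.__version__}, card reports {expected[name]}")
```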