---
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: CodeBERTa-commit-message-autocomplete
  results: []
---

# CodeBERTa-commit-message-autocomplete

This model is a fine-tuned version of [microsoft/codebert-base-mlm](https://huggingface.co/microsoft/codebert-base-mlm) on an unspecified commit-message dataset. It achieves the following results on the evaluation set:

- Loss: 1.8906
- Accuracy: 0.6346
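
Because the base model is a masked-language model, the most direct way to try it is with the `fill-mask` pipeline. The snippet below is a minimal sketch: the repo id `mamiksik/CodeBERTa-commit-message-autocomplete` is inferred from this card's title, and the example prompt is purely illustrative.

```python
from transformers import pipeline

# Repo id inferred from the card title (assumption); adjust if the model
# is hosted under a different namespace.
fill_mask = pipeline(
    "fill-mask",
    model="mamiksik/CodeBERTa-commit-message-autocomplete",
)

# The tokenizer is RoBERTa-style, so the mask token is <mask>.
# Autocomplete works by asking the model to fill a masked slot in a
# partial commit message.
for candidate in fill_mask("Fix <mask> handling in the login flow"):
    print(f"{candidate['token_str']!r}  score={candidate['score']:.3f}")
```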

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (an approximate `TrainingArguments` equivalent is sketched after the list):

- learning_rate: 2e-05
- train_batch_size: 64
- eval_batch_size: 64
- seed: 42
- gradient_accumulation_steps: 16
- total_train_batch_size: 1024
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 1000
- num_epochs: 50
- mixed_precision_training: Native AMP
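
For reproducing the setup, the list above maps onto `transformers.TrainingArguments` roughly as sketched below. `output_dir` and `evaluation_strategy` are assumptions not stated in this card, and the total train batch size of 1024 follows from 64 x 16 gradient accumulation.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="codeberta-commit-message-autocomplete",  # placeholder, not from the card
    learning_rate=2e-05,
    per_device_train_batch_size=64,
    per_device_eval_batch_size=64,
    seed=42,
    gradient_accumulation_steps=16,   # 64 * 16 = 1024 total train batch size
    lr_scheduler_type="linear",
    warmup_steps=1000,
    num_train_epochs=50,
    fp16=True,                        # "Native AMP" mixed precision
    evaluation_strategy="epoch",      # assumption: matches the per-epoch rows below
    # optimizer betas/epsilon in the card match the Trainer defaults,
    # so they are not set explicitly here.
)
```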

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| No log        | 1.0   | 40   | 4.5523          | 0.3432   |
| No log        | 2.0   | 80   | 3.8711          | 0.3796   |
| No log        | 3.0   | 120  | 3.2419          | 0.4503   |
| No log        | 4.0   | 160  | 2.8709          | 0.4962   |
| No log        | 5.0   | 200  | 2.6999          | 0.5085   |
| No log        | 6.0   | 240  | 2.6622          | 0.5216   |
| No log        | 7.0   | 280  | 2.5048          | 0.5410   |
| No log        | 8.0   | 320  | 2.4249          | 0.5581   |
| No log        | 9.0   | 360  | 2.3727          | 0.5623   |
| No log        | 10.0  | 400  | 2.3625          | 0.5665   |
| No log        | 11.0  | 440  | 2.3320          | 0.5706   |
| No log        | 12.0  | 480  | 2.1704          | 0.5950   |
| 3.081         | 13.0  | 520  | 2.2109          | 0.5893   |
| 3.081         | 14.0  | 560  | 2.2330          | 0.5884   |
| 3.081         | 15.0  | 600  | 2.1454          | 0.5954   |
| 3.081         | 16.0  | 640  | 2.1740          | 0.5951   |
| 3.081         | 17.0  | 680  | 2.1219          | 0.5920   |
| 3.081         | 18.0  | 720  | 2.1136          | 0.6052   |
| 3.081         | 19.0  | 760  | 2.0586          | 0.6127   |
| 3.081         | 20.0  | 800  | 2.0185          | 0.6113   |
| 3.081         | 21.0  | 840  | 2.0493          | 0.6129   |
| 3.081         | 22.0  | 880  | 1.9766          | 0.6217   |
| 3.081         | 23.0  | 920  | 1.9968          | 0.6189   |
| 3.081         | 24.0  | 960  | 1.9567          | 0.6276   |
| 2.122         | 25.0  | 1000 | 1.9611          | 0.6269   |
| 2.122         | 26.0  | 1040 | 1.9437          | 0.6254   |
| 2.122         | 27.0  | 1080 | 1.9865          | 0.6266   |
| 2.122         | 28.0  | 1120 | 1.9112          | 0.6295   |
| 2.122         | 29.0  | 1160 | 1.8903          | 0.6292   |
| 2.122         | 30.0  | 1200 | 1.8992          | 0.6376   |
| 2.122         | 31.0  | 1240 | 1.9122          | 0.6327   |
| 2.122         | 32.0  | 1280 | 1.8906          | 0.6346   |

### Framework versions

- Transformers 4.25.1
- Pytorch 1.13.0+cu117
- Datasets 2.7.1
- Tokenizers 0.13.2