---
tags:
  - generated_from_trainer
metrics:
  - accuracy
  - f1
base_model: microsoft/codebert-base
model-index:
  - name: CodeBertaCLM
    results: []
---

# CodeBertaCLM

This model is a fine-tuned version of [microsoft/codebert-base](https://huggingface.co/microsoft/codebert-base) on an unspecified dataset. It achieves the following results on the evaluation set:

- Loss: 2.5831
- Accuracy: 0.0144
- F1: 0.0144
- Bleu4: 0.0421
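
The card ships without usage code; below is a minimal generation sketch, assuming the checkpoint was saved with a causal-LM head (e.g. `RobertaForCausalLM`, consistent with the CLM fine-tuning objective) and using `"CodeBertaCLM"` as a placeholder for the actual Hub repo id or a local checkpoint path.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Placeholder: substitute the real Hub repo id or a local checkpoint directory.
model_id = "CodeBertaCLM"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

prompt = "def add(a, b):"
inputs = tokenizer(prompt, return_tensors="pt")
# Greedy decoding; sampling parameters are left at their defaults.
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```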

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a `TrainingArguments` sketch reproducing them follows the list):

- learning_rate: 2e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 200
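
The Adam betas and epsilon above are the `transformers` defaults, so they need no explicit arguments. A sketch of `TrainingArguments` matching the listed values; the data pipeline is omitted because the training set is undocumented, and `evaluation_strategy="epoch"` is an assumption inferred from the per-epoch results table below.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="CodeBertaCLM",
    learning_rate=2e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=200,
    evaluation_strategy="epoch",  # assumption: the table reports one eval per epoch
)
```

Note that although `num_epochs` was set to 200, the results table stops at epoch 35, so the run appears to have ended early; the card does not record why.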

### Training results

| Training Loss | Epoch | Step  | Validation Loss | Accuracy | F1     | Bleu4  |
|:-------------:|:-----:|:-----:|:---------------:|:--------:|:------:|:------:|
| 3.6734        | 1.0   | 1673  | 3.6884          | 0.0159   | 0.0159 | 0.0131 |
| 2.8139        | 2.0   | 3346  | 3.2517          | 0.0164   | 0.0164 | 0.0192 |
| 2.4176        | 3.0   | 5019  | 3.0747          | 0.0178   | 0.0178 | 0.0332 |
| 2.2785        | 4.0   | 6692  | 2.9695          | 0.0174   | 0.0174 | 0.0347 |
| 2.1557        | 5.0   | 8365  | 2.8886          | 0.0171   | 0.0171 | 0.0377 |
| 2.0357        | 6.0   | 10038 | 2.8313          | 0.0158   | 0.0158 | 0.0394 |
| 1.9615        | 7.0   | 11711 | 2.7865          | 0.0158   | 0.0158 | 0.0393 |
| 1.8982        | 8.0   | 13384 | 2.7498          | 0.0147   | 0.0147 | 0.0399 |
| 1.8233        | 9.0   | 15057 | 2.7195          | 0.0149   | 0.0149 | 0.0430 |
| 1.7866        | 10.0  | 16730 | 2.6925          | 0.0157   | 0.0157 | 0.0485 |
| 1.7237        | 11.0  | 18403 | 2.6745          | 0.0146   | 0.0146 | 0.0419 |
| 1.6757        | 12.0  | 20076 | 2.6616          | 0.0146   | 0.0146 | 0.0403 |
| 1.6452        | 13.0  | 21749 | 2.6377          | 0.0147   | 0.0147 | 0.0403 |
| 1.6036        | 14.0  | 23422 | 2.6216          | 0.0145   | 0.0145 | 0.0397 |
| 1.5818        | 15.0  | 25095 | 2.6169          | 0.0150   | 0.0150 | 0.0413 |
| 1.5389        | 16.0  | 26768 | 2.6047          | 0.0146   | 0.0146 | 0.0420 |
| 1.5131        | 17.0  | 28441 | 2.5940          | 0.0153   | 0.0153 | 0.0433 |
| 1.4822        | 18.0  | 30114 | 2.5899          | 0.0145   | 0.0145 | 0.0404 |
| 1.4461        | 19.0  | 31787 | 2.5812          | 0.0150   | 0.0150 | 0.0423 |
| 1.4149        | 20.0  | 33460 | 2.5841          | 0.0148   | 0.0148 | 0.0418 |
| 1.3933        | 21.0  | 35133 | 2.5783          | 0.0139   | 0.0139 | 0.0386 |
| 1.3752        | 22.0  | 36806 | 2.5730          | 0.0151   | 0.0151 | 0.0444 |
| 1.3412        | 23.0  | 38479 | 2.5709          | 0.0149   | 0.0149 | 0.0419 |
| 1.3307        | 24.0  | 40152 | 2.5699          | 0.0143   | 0.0143 | 0.0424 |
| 1.2909        | 25.0  | 41825 | 2.5648          | 0.0144   | 0.0144 | 0.0416 |
| 1.2679        | 26.0  | 43498 | 2.5615          | 0.0145   | 0.0145 | 0.0420 |
| 1.2603        | 27.0  | 45171 | 2.5626          | 0.0148   | 0.0148 | 0.0433 |
| 1.2203        | 28.0  | 46844 | 2.5670          | 0.0148   | 0.0148 | 0.0410 |
| 1.2134        | 29.0  | 48517 | 2.5536          | 0.0147   | 0.0147 | 0.0422 |
| 1.1907        | 30.0  | 50190 | 2.5701          | 0.0139   | 0.0139 | 0.0404 |
| 1.1702        | 31.0  | 51863 | 2.5722          | 0.0143   | 0.0143 | 0.0424 |
| 1.1555        | 32.0  | 53536 | 2.5679          | 0.0144   | 0.0144 | 0.0434 |
| 1.1371        | 33.0  | 55209 | 2.5694          | 0.0146   | 0.0146 | 0.0431 |
| 1.1189        | 34.0  | 56882 | 2.5692          | 0.0141   | 0.0141 | 0.0422 |
| 1.0989        | 35.0  | 58555 | 2.5831          | 0.0144   | 0.0144 | 0.0421 |

### Framework versions

- Transformers 4.25.1
- Pytorch 1.13.0+cu117
- Datasets 2.7.1
- Tokenizers 0.13.2
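
For reproducibility it may help to confirm the local environment matches these versions; a small check script, assuming all four packages are importable:

```python
# Compare installed package versions against those listed above.
import datasets
import tokenizers
import torch
import transformers

expected = {
    "transformers": "4.25.1",
    "torch": "1.13.0+cu117",
    "datasets": "2.7.1",
    "tokenizers": "0.13.2",
}
installed = {
    "transformers": transformers.__version__,
    "torch": torch.__version__,
    "datasets": datasets.__version__,
    "tokenizers": tokenizers.__version__,
}
for name, want in expected.items():
    have = installed[name]
    status = "OK" if have == want else f"mismatch (expected {want})"
    print(f"{name} {have}: {status}")
```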