# ConPlag Experiments
This model is a fine-tuned version of [huggingface/CodeBERTa-small-v1](https://huggingface.co/huggingface/CodeBERTa-small-v1) on an unknown dataset. It achieves the results reported in the training table below on the evaluation set.
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed
## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Recall | Precision | F1 | F-Beta Score |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:------:|:---------:|:------:|:------------:|
| No log | 1.0 | 40 | 0.6247 | 0.7372 | 0.6905 | 0.5577 | 0.6170 | 0.6433 |
| No log | 2.0 | 80 | 0.5205 | 0.7007 | 0.8333 | 0.5072 | 0.6306 | 0.6957 |
| 0.6216 | 3.0 | 120 | 0.4895 | 0.8467 | 0.5952 | 0.8621 | 0.7042 | 0.6579 |
| 0.6216 | 4.0 | 160 | 0.4377 | 0.8613 | 0.6667 | 0.8485 | 0.7467 | 0.7137 |
| 0.3445 | 5.0 | 200 | 0.4008 | 0.8759 | 0.8095 | 0.7907 | 0.8000 | 0.8036 |
| 0.3445 | 6.0 | 240 | 0.3619 | 0.9051 | 0.8095 | 0.8718 | 0.8395 | 0.8277 |
| 0.3445 | 7.0 | 280 | 0.3894 | 0.8978 | 0.8095 | 0.8500 | 0.8293 | 0.8216 |
| 0.1703 | 8.0 | 320 | 0.5518 | 0.9197 | 0.7857 | 0.9429 | 0.8571 | 0.8282 |
| 0.1703 | 9.0 | 360 | 0.4551 | 0.9197 | 0.8810 | 0.8605 | 0.8706 | 0.8745 |
| 0.0638 | 10.0 | 400 | 0.4796 | 0.9270 | 0.9286 | 0.8478 | 0.8864 | 0.9021 |
| 0.0638 | 11.0 | 440 | 0.5482 | 0.8905 | 0.9048 | 0.7755 | 0.8352 | 0.8606 |
| 0.0638 | 12.0 | 480 | 0.6586 | 0.9270 | 0.8810 | 0.8810 | 0.8810 | 0.8810 |
| 0.0324 | 13.0 | 520 | 0.6902 | 0.9270 | 0.8810 | 0.8810 | 0.8810 | 0.8810 |
| 0.0324 | 14.0 | 560 | 0.6943 | 0.9270 | 0.8810 | 0.8810 | 0.8810 | 0.8810 |
| 0.0208 | 15.0 | 600 | 0.6677 | 0.9197 | 0.8810 | 0.8605 | 0.8706 | 0.8745 |
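The F-Beta column is a weighted harmonic mean of precision and recall. The card does not state the β value used, but the reported numbers are numerically consistent with β = 1.5, which weights recall somewhat more heavily than precision (this is an inference from the table, not something the card confirms). A minimal sketch checking this against the table:

```python
def fbeta(precision: float, recall: float, beta: float) -> float:
    """Weighted harmonic mean of precision and recall.

    beta > 1 weights recall more heavily; beta = 1 reduces to plain F1.
    """
    b2 = beta * beta
    return (1 + b2) * precision * recall / (b2 * precision + recall)

# Epoch-1 row of the training table: precision 0.5577, recall 0.6905.
f1 = fbeta(0.5577, 0.6905, beta=1.0)  # ≈ 0.6170, matching the F1 column
fb = fbeta(0.5577, 0.6905, beta=1.5)  # ≈ 0.6433, matching the F-Beta column
print(f1, fb)
```

The same β = 1.5 reproduces the F-Beta values of the other rows as well (e.g. epoch 5: precision 0.7907, recall 0.8095 gives ≈ 0.8036).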