# ConPlag Experiments

Collection · 48 items
This model is a fine-tuned version of huggingface/CodeBERTa-small-v1 on an unknown dataset. It achieves the results reported in the training table below.

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed
## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Recall | Precision | F1 | F Beta Score |
|---|---|---|---|---|---|---|---|---|
| No log | 1.0 | 40 | 0.6090 | 0.7737 | 0.7381 | 0.6078 | 0.6667 | 0.6924 |
| No log | 2.0 | 80 | 0.5145 | 0.7372 | 0.9048 | 0.5429 | 0.6786 | 0.7508 |
| 0.6234 | 3.0 | 120 | 0.5056 | 0.8467 | 0.5714 | 0.8889 | 0.6957 | 0.6420 |
| 0.6234 | 4.0 | 160 | 0.4510 | 0.8832 | 0.6667 | 0.9333 | 0.7778 | 0.7309 |
| 0.3658 | 5.0 | 200 | 0.4820 | 0.8686 | 0.6190 | 0.9286 | 0.7429 | 0.6898 |
| 0.3658 | 6.0 | 240 | 0.4417 | 0.9051 | 0.7143 | 0.9677 | 0.8219 | 0.7769 |
| 0.3658 | 7.0 | 280 | 0.4399 | 0.8759 | 0.8571 | 0.7660 | 0.8090 | 0.8269 |
| 0.2126 | 8.0 | 320 | 0.5597 | 0.8978 | 0.7143 | 0.9375 | 0.8108 | 0.7708 |
| 0.2126 | 9.0 | 360 | 0.4830 | 0.9343 | 0.8333 | 0.9459 | 0.8861 | 0.8650 |
| 0.1306 | 10.0 | 400 | 0.5402 | 0.9270 | 0.8095 | 0.9444 | 0.8718 | 0.8467 |
| 0.1306 | 11.0 | 440 | 0.6326 | 0.9197 | 0.7857 | 0.9429 | 0.8571 | 0.8282 |
| 0.1306 | 12.0 | 480 | 0.5699 | 0.8978 | 0.8571 | 0.8182 | 0.8372 | 0.8448 |
| 0.0878 | 13.0 | 520 | 0.7407 | 0.8978 | 0.7619 | 0.8889 | 0.8205 | 0.7969 |
| 0.0878 | 14.0 | 560 | 0.6386 | 0.9197 | 0.8333 | 0.8974 | 0.8642 | 0.8521 |
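The F1 and F Beta Score columns can be reproduced from the Precision and Recall columns. The card does not state which β was used, but the reported F Beta values are consistent with β = 1.5 (i.e. recall-weighted); the β = 1.5 figure is inferred from the table, not stated anywhere in the card. A quick check:

```python
def f_beta(precision: float, recall: float, beta: float) -> float:
    """Weighted harmonic mean of precision and recall; beta > 1 weights recall higher."""
    b2 = beta * beta
    return (1 + b2) * precision * recall / (b2 * precision + recall)

# Epoch 1 row: precision 0.6078, recall 0.7381
print(f_beta(0.6078, 0.7381, beta=1.0))  # ≈ 0.6667 (F1 column)
print(f_beta(0.6078, 0.7381, beta=1.5))  # ≈ 0.6924 (F Beta Score column)

# Epoch 6 row: precision 0.9677, recall 0.7143
print(f_beta(0.9677, 0.7143, beta=1.5))  # ≈ 0.7769
```

Small discrepancies in the last digit are expected, since the precision and recall columns are themselves rounded to four places.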
Base model: huggingface/CodeBERTa-small-v1
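As a sketch of how such a fine-tuned checkpoint is wired for binary classification: the actual checkpoint id is not listed in this card, so instead of downloading weights, the example below builds a small, randomly initialized RoBERTa-style classifier (the architecture family of CodeBERTa-small-v1). Every size in the config is an illustrative assumption, not the real model's configuration.

```python
import torch
from transformers import RobertaConfig, RobertaForSequenceClassification

# Tiny stand-in config; the real fine-tuned model is larger and has trained weights.
config = RobertaConfig(
    vocab_size=52000,       # assumption: CodeBERTa-small-v1 uses a ~52k BPE vocab
    hidden_size=64,         # shrunk for the sketch
    num_hidden_layers=2,
    num_attention_heads=2,
    intermediate_size=128,
    num_labels=2,           # binary: plagiarized vs. not plagiarized
)
model = RobertaForSequenceClassification(config)
model.eval()

# A real input would be a tokenized pair of code fragments; here we fake token ids.
input_ids = torch.randint(0, config.vocab_size, (1, 16))
with torch.no_grad():
    logits = model(input_ids=input_ids).logits   # shape: (batch, num_labels)
probs = torch.softmax(logits, dim=-1)
print(probs.shape)  # torch.Size([1, 2])
```

With the real checkpoint, `RobertaForSequenceClassification.from_pretrained(...)` and the matching tokenizer would replace the random initialization above.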