# VLAI for CWE Guessing

This model is part of a collection of papers, models, and datasets supporting the AI and NLP components of the Vulnerability-Lookup project, for CWE guessing.
This model is a fine-tuned version of [microsoft/codebert-base](https://huggingface.co/microsoft/codebert-base) on an unknown dataset. Its evaluation-set results are reported per epoch in the training results table below.

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed
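Since the card does not yet include a usage snippet, here is a minimal sketch of how a fine-tuned CodeBERT sequence classifier such as this one can be queried once the checkpoint is published. The repository id and the example description are placeholders, not values taken from this card.

```python
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

# Hypothetical repository id: substitute the actual name of this fine-tuned checkpoint.
MODEL_ID = "your-org/vlai-codebert-cwe"

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForSequenceClassification.from_pretrained(MODEL_ID)
model.eval()

# Example vulnerability description (illustrative only).
description = (
    "A buffer overflow in the parsing routine allows a remote attacker "
    "to execute arbitrary code via a crafted packet."
)

# Tokenize the description and run a single forward pass through the classifier.
inputs = tokenizer(description, return_tensors="pt", truncation=True)
with torch.no_grad():
    logits = model(**inputs).logits

# Convert logits to probabilities and print the top-3 candidate CWE labels.
probs = torch.softmax(logits, dim=-1)[0]
top = torch.topk(probs, k=min(3, probs.shape[-1]))
for score, idx in zip(top.values.tolist(), top.indices.tolist()):
    print(model.config.id2label[idx], round(score, 4))
```

Reporting the top few candidates rather than a single label is a reasonable default here, given the gap between final-epoch accuracy (0.6854) and macro F1 (0.3481) in the results table, which suggests uneven per-class performance.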
## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

### Training results

Per-epoch training loss, validation loss, evaluation accuracy, and macro F1 are listed below; a hedged sketch of the corresponding fine-tuning setup follows the table.
| Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 Macro |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:--------:|
3.2599 | 1.0 | 25 | 3.2873 | 0.0225 | 0.0037 |
3.1436 | 2.0 | 50 | 3.3011 | 0.0562 | 0.0216 |
3.1168 | 3.0 | 75 | 3.3439 | 0.0449 | 0.0113 |
3.0315 | 4.0 | 100 | 3.3314 | 0.1461 | 0.0645 |
3.0604 | 5.0 | 125 | 3.3334 | 0.0899 | 0.0581 |
2.9746 | 6.0 | 150 | 3.3430 | 0.1124 | 0.0546 |
2.9773 | 7.0 | 175 | 3.3535 | 0.4157 | 0.0990 |
2.8666 | 8.0 | 200 | 3.2720 | 0.4831 | 0.2052 |
2.8196 | 9.0 | 225 | 3.2289 | 0.4270 | 0.1442 |
2.6704 | 10.0 | 250 | 3.1301 | 0.2584 | 0.1440 |
2.6964 | 11.0 | 275 | 3.0508 | 0.2809 | 0.1197 |
2.5442 | 12.0 | 300 | 2.9618 | 0.3596 | 0.1644 |
2.4519 | 13.0 | 325 | 2.9271 | 0.3596 | 0.1637 |
2.4064 | 14.0 | 350 | 2.8342 | 0.3933 | 0.2154 |
2.2469 | 15.0 | 375 | 2.7950 | 0.3596 | 0.2097 |
2.1662 | 16.0 | 400 | 2.7928 | 0.3596 | 0.1926 |
2.126 | 17.0 | 425 | 2.6786 | 0.4157 | 0.2223 |
2.0579 | 18.0 | 450 | 2.7615 | 0.3820 | 0.1987 |
1.8908 | 19.0 | 475 | 2.6469 | 0.4157 | 0.2015 |
1.8119 | 20.0 | 500 | 2.7396 | 0.4157 | 0.2097 |
1.8234 | 21.0 | 525 | 2.7319 | 0.3933 | 0.2101 |
1.6483 | 22.0 | 550 | 2.7024 | 0.4607 | 0.2504 |
1.7195 | 23.0 | 575 | 2.6693 | 0.4944 | 0.2345 |
1.5326 | 24.0 | 600 | 2.6387 | 0.5169 | 0.2341 |
1.5649 | 25.0 | 625 | 2.6509 | 0.6180 | 0.2934 |
1.4294 | 26.0 | 650 | 2.7232 | 0.6292 | 0.3175 |
1.4872 | 27.0 | 675 | 2.6745 | 0.6404 | 0.3005 |
1.3451 | 28.0 | 700 | 2.6499 | 0.6517 | 0.3100 |
1.296 | 29.0 | 725 | 2.6788 | 0.6517 | 0.3290 |
1.2962 | 30.0 | 750 | 2.6351 | 0.6517 | 0.3129 |
1.2969 | 31.0 | 775 | 2.6432 | 0.6742 | 0.3226 |
1.1886 | 32.0 | 800 | 2.6496 | 0.6742 | 0.3226 |
1.1426 | 33.0 | 825 | 2.6603 | 0.6742 | 0.3230 |
1.1833 | 34.0 | 850 | 2.6660 | 0.6742 | 0.3253 |
1.14 | 35.0 | 875 | 2.6588 | 0.6854 | 0.3477 |
1.0947 | 36.0 | 900 | 2.6501 | 0.6854 | 0.3477 |
1.0714 | 37.0 | 925 | 2.6654 | 0.6854 | 0.3477 |
1.0678 | 38.0 | 950 | 2.6454 | 0.6854 | 0.3477 |
1.0535 | 39.0 | 975 | 2.6375 | 0.6854 | 0.3481 |
1.0273 | 40.0 | 1000 | 2.6422 | 0.6854 | 0.3481 |
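The exact hyperparameters are not preserved above, but the table implies a standard sequence-classification fine-tune: 40 epochs, 25 optimizer steps per epoch, and per-epoch evaluation on accuracy and macro F1. Below is a minimal sketch of such a setup with the Hugging Face Trainer; the toy dataset, CWE label set, batch size, and learning rate are illustrative assumptions, not values taken from this card.

```python
import numpy as np
from datasets import Dataset
from sklearn.metrics import accuracy_score, f1_score
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

# Assumed CWE label space and toy data; the real dataset is not described in this card.
cwe_labels = ["CWE-79", "CWE-89", "CWE-119"]
label2id = {label: i for i, label in enumerate(cwe_labels)}
id2label = {i: label for label, i in label2id.items()}

raw = Dataset.from_dict({
    "text": [
        "SQL injection in the login form allows reading arbitrary tables.",
        "Reflected cross-site scripting via the search parameter.",
        "Heap buffer overflow when parsing oversized headers.",
    ],
    "label": [label2id["CWE-89"], label2id["CWE-79"], label2id["CWE-119"]],
})

tokenizer = AutoTokenizer.from_pretrained("microsoft/codebert-base")
model = AutoModelForSequenceClassification.from_pretrained(
    "microsoft/codebert-base",
    num_labels=len(cwe_labels),
    id2label=id2label,
    label2id=label2id,
)

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, padding="max_length", max_length=256)

dataset = raw.map(tokenize, batched=True)

def compute_metrics(eval_pred):
    # Accuracy and macro F1, matching the columns of the results table above.
    preds = np.argmax(eval_pred.predictions, axis=-1)
    return {
        "accuracy": accuracy_score(eval_pred.label_ids, preds),
        "f1_macro": f1_score(eval_pred.label_ids, preds, average="macro"),
    }

args = TrainingArguments(
    output_dir="codebert-cwe",
    num_train_epochs=40,             # the results table reports 40 epochs
    eval_strategy="epoch",           # one evaluation row per epoch, as in the table
    logging_strategy="epoch",        # one training-loss value per epoch, as in the table
    per_device_train_batch_size=16,  # assumed; not stated in the card
    learning_rate=2e-5,              # assumed; not stated in the card
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=dataset,           # toy split; use real train/validation splits in practice
    eval_dataset=dataset,
    compute_metrics=compute_metrics,
)
trainer.train()
```

Note that 25 optimizer steps per epoch implies a training set of roughly 25 times the batch size, i.e. a few hundred examples at typical batch sizes, which is consistent with the noisy accuracy and macro F1 curves in the table.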