EfficientCodeBERT: A CodeBERT-Based Student Model for Vulnerability Detection

EfficientCodeBERT is a fine-tuned and distilled version of CodeBERT for detecting vulnerabilities in source code. The student architecture is sized down for efficiency, shrinking the model while retaining competitive accuracy. At roughly 35 million parameters, this lightweight model remains effective for binary vulnerability classification.

Model Details:

  • Base Model: microsoft/codebert-base
  • Architecture: hidden size 384, 8 layers, 6 attention heads
  • Max Sequence Length: 128
  • Dataset: DiverseVul
  • Task: Vulnerability detection (binary classification)
  • Weights: F32 safetensors, ~37M parameters (safetensors metadata)
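Example usage (a minimal sketch): the snippet below loads the model with the transformers library and classifies a single code snippet. The repository id, the label order (0 = not vulnerable, 1 = vulnerable), and the example input are assumptions, since the card does not specify them; substitute the actual Hub id or a local checkpoint path.

```python
# Minimal inference sketch for the distilled vulnerability-detection model.
# "your-username/EfficientCodeBERT" is a placeholder repo id (an assumption);
# replace it with the real Hub id or a local checkpoint directory.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

MODEL_ID = "your-username/EfficientCodeBERT"  # hypothetical id

# The tokenizer follows the microsoft/codebert-base (RoBERTa-style) vocabulary.
tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForSequenceClassification.from_pretrained(MODEL_ID)
model.eval()

code_snippet = """
char buf[8];
strcpy(buf, user_input);  /* unbounded copy into a fixed-size buffer */
"""

# Truncate to the model's 128-token context window.
inputs = tokenizer(
    code_snippet,
    truncation=True,
    max_length=128,
    return_tensors="pt",
)

with torch.no_grad():
    logits = model(**inputs).logits

probs = torch.softmax(logits, dim=-1).squeeze()
predicted = int(torch.argmax(probs))
labels = {0: "not vulnerable", 1: "vulnerable"}  # assumed label order
print(f"{labels[predicted]} (p={probs[predicted].item():.2f})")
```

Inputs longer than the 128-token maximum sequence length are truncated, so very long functions may need to be split or windowed before classification.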