SecureShellBert is a CodeBERT model fine-tuned for Masked Language Modelling.
The model was domain-adapted following the Hugging Face masked-language-modelling guide on a corpus of >20k Unix sessions, comprising both malign sessions (see more at HaaS) and benign sessions (see more at NLP2Bash).
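Once on the Hub, the checkpoint can be queried directly through the fill-mask pipeline. A minimal sketch; the repository id below is an assumption, so substitute the actual Hub id:

```python
from transformers import pipeline

# Assumed repository id; replace with the model's actual Hub id.
fill_mask = pipeline("fill-mask", model="SmartDataPolito/SecureShellBert")

# CodeBERT is RoBERTa-based, so the mask token is <mask>.
predictions = fill_mask("cat /etc/<mask> | grep root")
for p in predictions:
    print(f"{p['token_str']!r}: {p['score']:.3f}")
```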
The model was trained with the following hyperparameters (a sketch reproducing this setup follows the list):
- epochs: 10
- MLM probability: 0.15
- batch size: 16
- learning rate: 1e-5
- chunk size: 256
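For reference, here is a minimal sketch of how such a domain-adaptation run could look with the Hugging Face `Trainer`, using the hyperparameters above. The base checkpoint, corpus file path, and chunking helper are assumptions, not the authors' exact training script:

```python
from datasets import load_dataset
from transformers import (
    AutoModelForMaskedLM,
    AutoTokenizer,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)

# Assumed base checkpoint: the original CodeBERT model.
checkpoint = "microsoft/codebert-base"
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForMaskedLM.from_pretrained(checkpoint)

# Placeholder corpus: one Unix session per line in a plain-text file.
dataset = load_dataset("text", data_files={"train": "sessions.txt"})

chunk_size = 256  # chunk size from the list above


def tokenize_and_chunk(examples):
    # Tokenize, concatenate, and split into fixed-size chunks,
    # dropping the incomplete remainder.
    tokenized = tokenizer(examples["text"])
    concatenated = {k: sum(tokenized[k], []) for k in tokenized.keys()}
    total = (len(concatenated["input_ids"]) // chunk_size) * chunk_size
    return {
        k: [v[i : i + chunk_size] for i in range(0, total, chunk_size)]
        for k, v in concatenated.items()
    }


lm_dataset = dataset["train"].map(
    tokenize_and_chunk, batched=True, remove_columns=["text"]
)

# Randomly mask 15% of tokens, per the MLM probability above.
collator = DataCollatorForLanguageModeling(
    tokenizer=tokenizer, mlm_probability=0.15
)

args = TrainingArguments(
    output_dir="secureshellbert",
    num_train_epochs=10,
    per_device_train_batch_size=16,
    learning_rate=1e-5,
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=lm_dataset,
    data_collator=collator,
)
trainer.train()
```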
This model was used to fine-tune LogPrecis. See GitHub for code and data, and please cite our article.
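A minimal sketch of reusing this checkpoint as the backbone for a downstream fine-tuning task of that kind (token classification); the repository id and label count are placeholders, not the LogPrecis setup:

```python
from transformers import AutoModelForTokenClassification, AutoTokenizer

# Assumed Hub id; replace with the model's actual repository id.
repo = "SmartDataPolito/SecureShellBert"
tokenizer = AutoTokenizer.from_pretrained(repo)
# num_labels is a placeholder for the downstream label set.
model = AutoModelForTokenClassification.from_pretrained(repo, num_labels=7)
```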