## JavaBERT
A BERT-like model pretrained on Java software code.
### Training Data
The model was trained on 2,998,345 Java files retrieved from open-source projects on GitHub.
### Training Objective
The model was trained with a masked language modeling (MLM) objective.
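In MLM pretraining, a fraction of the input tokens is replaced with a special `[MASK]` token and the model learns to predict the original tokens from the surrounding context. The sketch below illustrates the masking step in plain Python; it is a simplification (the full BERT recipe also sometimes keeps or randomizes selected tokens), and the tokenization shown is illustrative only.

```python
import random

def mask_tokens(tokens, mask_token='[MASK]', mask_prob=0.15, seed=1):
    """Randomly replace a fraction of tokens with the mask token.

    Simplified illustration of MLM preprocessing: each token is masked
    with probability `mask_prob`; the label records the original token
    at masked positions and None elsewhere (no loss is computed there).
    """
    rng = random.Random(seed)
    masked, labels = [], []
    for tok in tokens:
        if rng.random() < mask_prob:
            masked.append(mask_token)
            labels.append(tok)   # model must recover this token
        else:
            masked.append(tok)
            labels.append(None)  # position excluded from the loss
    return masked, labels

# Whitespace split stands in for the real subword tokenizer.
tokens = 'public static void main ( String [ ] args )'.split()
masked, labels = mask_tokens(tokens)
```

During training, the loss is computed only at the masked positions, so the model cannot solve the task by copying its input.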
### Usage
```python
from transformers import pipeline
pipe = pipeline('fill-mask', model='CAUKiel/JavaBERT')

# Pass Java code with '[MASK]' marking the tokens to predict;
# the snippet below is only an example input.
code = 'public [MASK] void main(String[] args) { }'
output = pipe(code)  # top predictions for the masked token
```