---
license: mit
arxiv: 2302.04026
pipeline_tag: fill-mask
tags:
  - code
---

# C-BERT MLM

*Exploring Software Naturalness through Neural Language Models*

## Overview

This model is an unofficial Hugging Face version of "C-BERT" with only the masked language modeling head used for pretraining. The weights come from "An Empirical Comparison of Pre-Trained Models of Source Code". Please cite the authors if you use this model in an academic setting.
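
Below is a minimal usage sketch with the Transformers `fill-mask` pipeline. The model ID (`claudios/cbert`) is inferred from this repository's path and may need adjusting, and the C snippet is purely illustrative.

```python
from transformers import AutoTokenizer, AutoModelForMaskedLM, pipeline

# NOTE: the model ID is an assumption based on this repository's path;
# replace it with the actual hosted name if it differs.
model_id = "claudios/cbert"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForMaskedLM.from_pretrained(model_id)
fill_mask = pipeline("fill-mask", model=model, tokenizer=tokenizer)

# Use the tokenizer's own mask token so the example works regardless of
# whether it is the BERT-style "[MASK]" or a different symbol.
snippet = f"int main() {{ {tokenizer.mask_token} 0; }}"
print(fill_mask(snippet))
```

The pipeline returns the top candidate tokens for the masked position along with their scores; since C-BERT is pretrained on C source code, code-like inputs such as the snippet above are the intended use case.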