
This is an unofficial reupload of microsoft/graphcodebert-base in the SafeTensors format, produced with transformers 4.40.1. The goal of this reupload is to keep older models that are still relevant baselines from becoming stale as a result of breaking changes in the Hugging Face ecosystem. Additionally, I may include minor corrections, such as the model max length configuration.
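A minimal sketch of loading this reupload with the transformers library; the repo id below is a placeholder, not the actual path of this repository:

```python
from transformers import AutoModel, AutoTokenizer

# Placeholder repo id -- substitute the actual id of this reupload.
repo_id = "<username>/graphcodebert-base"

tokenizer = AutoTokenizer.from_pretrained(repo_id)
# use_safetensors=True makes transformers load the .safetensors weights
# rather than a legacy pytorch_model.bin checkpoint.
model = AutoModel.from_pretrained(repo_id, use_safetensors=True)

# The corrected model max length configuration should be reflected here.
print(tokenizer.model_max_length)  # 512
```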

Original model card below:


GraphCodeBERT model

GraphCodeBERT is a graph-based pre-trained model built on the Transformer architecture for programming languages; in addition to code token sequences, it also considers data-flow information. GraphCodeBERT consists of 12 layers, 768-dimensional hidden states, and 12 attention heads, with a maximum sequence length of 512. The model is trained on the CodeSearchNet dataset, which includes 2.3M functions paired with documentation across six programming languages.
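As an illustration, a minimal sketch (assuming the upstream microsoft/graphcodebert-base checkpoint) that inspects the configuration values described above and encodes a short function:

```python
from transformers import AutoConfig, AutoModel, AutoTokenizer

config = AutoConfig.from_pretrained("microsoft/graphcodebert-base")
print(config.num_hidden_layers)    # 12 layers
print(config.hidden_size)          # 768-dimensional hidden states
print(config.num_attention_heads)  # 12 attention heads

tokenizer = AutoTokenizer.from_pretrained("microsoft/graphcodebert-base")
model = AutoModel.from_pretrained("microsoft/graphcodebert-base")

# Encode a code snippet, truncated to the 512-token maximum sequence length.
inputs = tokenizer(
    "def max(a, b): return a if a > b else b",
    return_tensors="pt", truncation=True, max_length=512,
)
outputs = model(**inputs)
print(outputs.last_hidden_state.shape)  # (1, seq_len, 768)
```

GraphCodeBERT shares the RoBERTa architecture, so the checkpoint loads through the standard Auto classes.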

More details can be found in the paper GraphCodeBERT: Pre-training Code Representations with Data Flow by Guo et al.

Disclaimer: The team releasing GraphCodeBERT did not write a model card for this model, so this model card has been written by Hugging Face community members.
