
Tokenizer

We trained our tokenizer with SentencePiece's unigram algorithm, then loaded it as an MT5TokenizerFast.

Model

We used the MT5-base model.
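A sketch of instantiating an mt5-base-sized model with a smaller custom vocabulary. All configuration values below are assumptions: the card only names MT5-base and reports 241M parameters, which is roughly consistent with base-sized blocks plus a reduced (~32k) vocabulary rather than the stock 250k one.

```python
from transformers import MT5Config, MT5ForConditionalGeneration

# Assumed settings: mt5-base block dimensions with a custom 32k vocab.
config = MT5Config(
    vocab_size=32000,   # assumed custom-tokenizer vocab, not the stock 250k
    d_model=768,
    d_ff=2048,
    num_layers=12,
    num_decoder_layers=12,
    num_heads=12,
)
model = MT5ForConditionalGeneration(config)  # randomly initialized
print(f"{model.num_parameters() / 1e6:.0f}M parameters")
```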

Datasets

We used the CodeSearchNet dataset and some data scraped from the internet to train the model. We maintained a list of datasets, where each dataset contained code in a single language.
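The per-language grouping described above might be assembled like this. The sample records are hypothetical placeholders, not the card's actual data:

```python
from collections import defaultdict

# Hypothetical scraped samples as (language, code) pairs.
scraped = [
    ("python", "def add(a, b): return a + b"),
    ("java", "int add(int a, int b) { return a + b; }"),
    ("python", "print('hi')"),
]

# Build one dataset per language, as described in the card.
per_language = defaultdict(list)
for lang, code in scraped:
    per_language[lang].append(code)

datasets = list(per_language.values())  # list of single-language datasets
```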

Plots

Training curves (plot images not reproduced here):

- Train loss
- Evaluation loss
- Evaluation accuracy
- Learning rate

Fine-tuning (WIP)

We fine-tuned the model on the CodeXGLUE code-to-code-trans dataset and additional scraped data.
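CodeXGLUE's code-to-code-trans task translates between Java and C#. A sketch of formatting its examples into T5-style (input, target) text pairs; the task-prefix string and function name are assumptions, not the card's actual preprocessing:

```python
# Format a CodeXGLUE code-to-code-trans example (Java -> C#) into a
# T5-style (input text, target text) pair. The prefix is an assumption.
def format_example(java_code: str, cs_code: str) -> tuple[str, str]:
    return (f"translate Java to C#: {java_code}", cs_code)

# Toy example pair.
pair = format_example("int x = 0;", "int x = 0;")
print(pair[0])
```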

Model size: 241M parameters (F32 tensors, Safetensors format).