# NinedayWang/PolyCoder-2.7B


This is a PolyCoder model with 2.7B parameters, presented in the paper "A Systematic Evaluation of Large Language Models of Code" (MAPS 2022 and the ICLR 2022 Deep Learning 4 Code workshop).

The model was trained on 249 GB of code across 12 programming languages.

Note: this model requires transformers version 4.23.0 or later:

```
pip install "transformers>=4.23.0"
```
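Once a compatible transformers version is installed, the model can be loaded through the standard `AutoTokenizer`/`AutoModelForCausalLM` interface. The sketch below is a minimal greedy-decoding example; the prompt and generation settings are illustrative choices, not part of the model card, and running it downloads the 2.7B-parameter weights on first use.

```python
MODEL_NAME = "NinedayWang/PolyCoder-2.7B"

def generate(prompt: str, max_new_tokens: int = 64) -> str:
    """Complete `prompt` with greedy decoding and return the full text."""
    # Imported lazily so this module loads even where transformers is absent.
    from transformers import AutoTokenizer, AutoModelForCausalLM

    tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
    model = AutoModelForCausalLM.from_pretrained(MODEL_NAME)

    inputs = tokenizer(prompt, return_tensors="pt")
    outputs = model.generate(
        **inputs,
        max_new_tokens=max_new_tokens,
        do_sample=False,  # greedy decoding for reproducible output
    )
    return tokenizer.decode(outputs[0], skip_special_tokens=True)

if __name__ == "__main__":
    # Example code-completion prompt (hypothetical, for illustration only).
    print(generate("def binary_search(arr, target):"))
```

Sampling parameters such as `temperature` or `top_p` can be passed to `generate` in place of `do_sample=False` when more varied completions are wanted.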


```bibtex
@inproceedings{xu2022systematic,
  title={A Systematic Evaluation of Large Language Models of Code},
  author={Xu, Frank F. and Alon, Uri and Neubig, Graham and Hellendoorn, Vincent Josua},
  booktitle={Proceedings of the 6th ACM SIGPLAN International Symposium on Machine Programming (MAPS)},
  year={2022}
}
```