InCoder: A Generative Model for Code Infilling and Synthesis

Demo of the 6.7B parameter version of InCoder: a decoder-only Transformer model that can both extend and insert/infill code.

Select one of the examples below, or enter your own code in the editor. You can type <infill> to mark a location where you want the model to insert code.

Click "Extend" to append text at the end of the editor. Click "Infill" to replace all <infill> masks. (Click "Add <infill> mask" to add a mask at the cursor or replace the current selection.)

More Info

See our project site for more information on these models, including a paper and examples.

For instructions on setting up and using the models (via HuggingFace transformers), see our readme.
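
As a minimal sketch of that transformers-based usage (assuming the facebook/incoder-1B checkpoint ID published on the Hugging Face Hub; the readme is the authoritative reference, including for infilling with the special mask tokens):

    # Minimal left-to-right generation sketch; see the readme for infilling
    # and for the larger 6.7B checkpoint.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained("facebook/incoder-1B")
    model = AutoModelForCausalLM.from_pretrained("facebook/incoder-1B")

    prompt = "def hello_world():"
    inputs = tokenizer(prompt, return_tensors="pt")
    outputs = model.generate(**inputs, max_new_tokens=64, do_sample=True, temperature=0.6)
    print(tokenizer.decode(outputs[0]))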

Credits

This model was developed at Facebook AI Research by Daniel Fried*, Armen Aghajanyan*, Jessy Lin, Sida Wang, Eric Wallace, Freda Shi, Ruiqi Zhong, Wen-tau Yih, Luke Zettlemoyer, and Mike Lewis.

Thanks to Naman Goyal and Stephen Roller for writing the code on which this demo is based. Extensions by Daniel Fried and Sida Wang.