Various architectures are used in code generation models, but most of them follow the auto-regressive, left-to-right setting of GPT. InCoder, however, uses a decoder-only Transformer trained with a causal masking objective, which combines next-token prediction with bidirectional context through masking, while AlphaCode uses an encoder-decoder architecture. For model-specific information about the architecture, please select a model below:
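Regardless of the architecture family, decoder-only checkpoints can be loaded through the standard causal-LM API of the `transformers` library. A minimal sketch, assuming `transformers` is installed and using the public `facebook/incoder-1B` checkpoint purely as an illustration:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Example checkpoint (illustrative choice): InCoder is a decoder-only model
# trained with a causal masking objective, so it loads as a standard causal LM.
checkpoint = "facebook/incoder-1B"
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForCausalLM.from_pretrained(checkpoint)

# Left-to-right generation from a code prompt.
prompt = "def fibonacci(n):"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0]))
```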