---
license: apache-2.0
---

StarCoder fine-tuned on the 20k AlpacaCode dataset.
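
A minimal usage sketch with `transformers`, assuming a standard causal-LM checkpoint; the repository id below is a placeholder, substitute the actual checkpoint name for this model.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Placeholder repository id (assumption) -- replace with the real checkpoint name.
checkpoint = "your-username/starcoder-alpacacode-20k"

tokenizer = AutoTokenizer.from_pretrained(checkpoint)
# device_map="auto" requires `accelerate`; drop it to load on CPU.
model = AutoModelForCausalLM.from_pretrained(checkpoint, device_map="auto", torch_dtype="auto")

# Plain prompt; the exact instruction format used during fine-tuning is not documented here.
prompt = "Write a Python function that checks whether a string is a palindrome."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```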

The StarCoder models are 15.5B-parameter models trained on 80+ programming languages from The Stack (v1.2), with opt-out requests excluded. The base model uses Multi-Query Attention, has a context window of 8192 tokens, and was trained on 1 trillion tokens with the Fill-in-the-Middle objective.
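
Because the base model was trained with the Fill-in-the-Middle objective, it can be prompted to complete code between a given prefix and suffix. The sketch below uses the FIM special tokens from the base StarCoder tokenizer (`<fim_prefix>`, `<fim_suffix>`, `<fim_middle>`); whether the fine-tuned checkpoint retains this behaviour is an assumption.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

checkpoint = "your-username/starcoder-alpacacode-20k"  # placeholder repository id (assumption)
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForCausalLM.from_pretrained(checkpoint, device_map="auto")

# Fill-in-the-Middle prompt: the model generates the code that belongs between
# the prefix and the suffix. Token names follow the base StarCoder tokenizer.
prefix = "def read_config(path):\n    "
suffix = "\n    return config"
prompt = f"<fim_prefix>{prefix}<fim_suffix>{suffix}<fim_middle>"

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```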