loubnabnl (HF staff) committed
Commit 8a737b0
1 parent: cc14b64
Files changed (1)
  1. architectures/polycoder.txt +1 -1
architectures/polycoder.txt CHANGED
@@ -1,4 +1,4 @@
- [PolyCoder](https://github.com/VHellendoorn/Code-LMs) uses GPT2 architecture, with BPE tokenizer trained on a random 5% subset of the data (all languages), and a context mength of 2048. To study the effect of scaling of model size, the odel was trained in 3 different sizes.
+ [PolyCoder](https://github.com/VHellendoorn/Code-LMs) uses GPT2 architecture, with BPE tokenizer trained on a random 5% subset of the data (all languages), and a context length of 2048. To study the effect of scaling of model size, the odel was trained in 3 different sizes.

  <div align="center">
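
For reference, the architecture described in the file (a GPT-2 style model with a 2048-token context window) can be sketched with the Hugging Face `transformers` API. This is a minimal illustration rather than the released PolyCoder training code, and the width/depth values below are placeholders, since the file only states that the model was trained in 3 different sizes without listing them.

```python
from transformers import GPT2Config, GPT2LMHeadModel

# GPT-2 architecture with the 2048-token context length mentioned above.
# Hidden size, depth, and head count are placeholders, not PolyCoder's values.
config = GPT2Config(
    n_positions=2048,  # context length of 2048
    n_embd=768,        # placeholder hidden size
    n_layer=12,        # placeholder number of layers
    n_head=12,         # placeholder number of attention heads
)
model = GPT2LMHeadModel(config)
print(model.config.n_positions)  # 2048
```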