Muennighoff committed on
Commit fb30a58
1 Parent(s): ea5895e

Update README.md

Files changed (1)
  1. README.md +11 -4
README.md CHANGED
@@ -98,15 +98,14 @@ model-index:
 
  # Model Summary
 
- SantaCoderPack is an pre-trained model with the same architecture of SantaCoder on
- <th><a href=https://huggingface.co/datasets/bigcode/commitpack>CommitPack</a> using this format:
+ SantaCoderPack is a pre-trained model with the same architecture as SantaCoder, trained on <a href=https://huggingface.co/datasets/bigcode/commitpack>CommitPack</a> using this format:
 
  ```
  <commit_before>code_before<commit_msg>message<commit_after>code_after
  ```
 
  - **Repository:** [bigcode/octopack](https://github.com/bigcode-project/octopack)
- - **Paper:** [TODO]()
+ - **Paper:** [OctoPack: Instruction Tuning Code Large Language Models](https://arxiv.org/abs/2308.07124)
  - **Languages:** Python, JavaScript, Java, C++, Go, Rust
  - **SantaCoderPack:**
  <table>
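The `<commit_before>code_before<commit_msg>message<commit_after>code_after` template in the hunk above is the model's entire input contract, so a minimal sketch of assembling such a prompt may help. Only the three special tokens come from the README; the helper name and the example code and message below are hypothetical.

```python
# Minimal sketch of the CommitPack prompt format used by SantaCoderPack.
# Only the <commit_before>/<commit_msg>/<commit_after> tokens come from the README;
# the helper name and the example inputs are made up for illustration.

def build_commit_prompt(code_before: str, message: str) -> str:
    """Concatenate the three segments; the model is expected to continue after <commit_after>."""
    return f"<commit_before>{code_before}<commit_msg>{message}<commit_after>"

prompt = build_commit_prompt(
    code_before="def add(a, b):\n    return a - b\n",
    message="Fix add to return the sum instead of the difference",
)
print(prompt)
```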
@@ -137,6 +136,7 @@ The model follows instructions provided in the input. We recommend prefacing you
  **Feel free to share your generations in the Community tab!**
 
  ## Generation
+
  ```python
  # pip install -q transformers
  from transformers import AutoModelForCausalLM, AutoTokenizer
@@ -171,4 +171,11 @@ print(tokenizer.decode(outputs[0]))
 
  # Citation
 
- TODO
+ ```bibtex
+ @article{muennighoff2023octopack,
+   title={OctoPack: Instruction Tuning Code Large Language Models},
+   author={Niklas Muennighoff and Qian Liu and Armel Zebaze and Qinkai Zheng and Binyuan Hui and Terry Yue Zhuo and Swayam Singh and Xiangru Tang and Leandro von Werra and Shayne Longpre},
+   journal={arXiv preprint arXiv:2308.07124},
+   year={2023}
+ }
+ ```
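The Generation snippet is only partially visible in the diff (the hunk context stops after the imports and resumes at the final `print`), so here is a self-contained sketch of the same flow. The checkpoint id `bigcode/santacoderpack` and the `trust_remote_code=True` flag are assumptions based on the model name and on SantaCoder's custom architecture; follow the full snippet in the README itself where they differ.

```python
# Self-contained generation sketch; the checkpoint id and trust_remote_code flag
# are assumptions, not taken from the diff above.
# pip install -q transformers
from transformers import AutoModelForCausalLM, AutoTokenizer

checkpoint = "bigcode/santacoderpack"  # assumed repository id
device = "cuda"  # or "cpu"

tokenizer = AutoTokenizer.from_pretrained(checkpoint, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(checkpoint, trust_remote_code=True).to(device)

# Prompt in the CommitPack format from the Model Summary section.
prompt = (
    "<commit_before>def add(a, b):\n    return a - b"
    "<commit_msg>Fix add to return the sum"
    "<commit_after>"
)

inputs = tokenizer(prompt, return_tensors="pt").to(device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0]))
```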