Text Generation · Transformers · PyTorch · Safetensors · code · Eval Results · Inference Endpoints
Muennighoff committed
Commit 5181078
1 Parent(s): 2c3af99

Update README.md

Files changed (1): README.md (+9, -15)
README.md CHANGED
@@ -231,20 +231,14 @@ model-index:
 
 ![Octopack](https://github.com/bigcode-project/octopack/blob/31f3320f098703c7910e43492c39366eeea68d83/banner.png?raw=true)
 
-# OctoCoder
+# Table of Contents
 
-Play with the model on the [TODO Playground](https://huggingface.co/spaces/bigcode/bigcode-playground).
+1. [Model Summary](#model-summary)
+2. [Use](#use)
+3. [Training](#training)
+4. [Citation](#citation)
 
-## Table of Contents
-
-1. [Model Summary](##model-summary)
-2. [Use](##use)
-3. [Limitations](##limitations)
-4. [Training](##training)
-5. [License](##license)
-6. [Citation](##citation)
-
-## Model Summary
+# Model Summary
 
 OctoCoder is an instruction tuned model with 15.5B parameters created by finetuning StarCoder on CommitPackFT & OASST as described in the OctoPack paper.
 
@@ -281,15 +275,15 @@ OctoCoder is an instruction tuned model with 15.5B parameters created by finetun
 </table>
 
 
-## Use
+# Use
 
-### Intended use
+## Intended use
 
 The model follows instructions provided in the input. We recommend prefacing your input with "Question: " and finishing with "Answer:", for example: "Question: Please write a function in Python that performs bubble sort.\n\nAnswer:"
 
 **Feel free to share your generations in the Community tab!**
 
-### Generation
+## Generation
 ```python
 # pip install -q transformers
 from transformers import AutoModelForCausalLM, AutoTokenizer
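
The Generation snippet in the diff is cut off after the import line. A minimal sketch of how such a snippet typically continues, using the "Question: ... Answer:" prompt format recommended in the Intended use section, is shown below; the checkpoint id `bigcode/octocoder`, the device handling, and the 256-token budget are assumptions, not part of this commit.

```python
# pip install -q transformers
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Assumption: the model is hosted under this repo id; substitute the actual checkpoint if it differs.
checkpoint = "bigcode/octocoder"
device = "cuda" if torch.cuda.is_available() else "cpu"

tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForCausalLM.from_pretrained(checkpoint).to(device)

# Prompt format recommended in the README: "Question: ...\n\nAnswer:"
prompt = "Question: Please write a function in Python that performs bubble sort.\n\nAnswer:"

inputs = tokenizer.encode(prompt, return_tensors="pt").to(device)
outputs = model.generate(inputs, max_new_tokens=256)  # generation budget is an assumption
print(tokenizer.decode(outputs[0]))
```

Greedy decoding is used by default; sampling parameters (e.g. `do_sample=True`, `temperature`) can be passed to `generate` for more varied completions.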