guoday committed on
Commit 3053526
1 Parent(s): 2a1cbc2

Update README.md

Files changed (1): README.md +58 -0
README.md CHANGED

---
license: other
license_name: deepseek
license_link: LICENSE
---

### 1. Introduction of Deepseek Coder

Deepseek Coder comprises a series of code language models trained on 87% code and 13% natural language in English and Chinese, with each model pre-trained on 2T tokens. We provide various sizes of the code model, ranging from 1B to 33B versions. Each model is pre-trained on a project-level code corpus with a window size of 16K and an extra fill-in-the-blank task, to support project-level code completion and infilling. For coding capabilities, Deepseek Coder achieves state-of-the-art performance among open-source code models on multiple programming languages and various benchmarks.

- **Massive Training Data**: Trained on 2T tokens, including 87% code and 13% linguistic data in both English and Chinese.

- **Highly Flexible & Scalable**: Offered in model sizes of 1.3B, 5.7B, 6.7B, and 33B, enabling users to choose the setup most suitable for their requirements.

- **Superior Model Performance**: State-of-the-art performance among publicly available code models on the HumanEval, MultiPL-E, MBPP, DS-1000, and APPS benchmarks.

- **Advanced Code Completion Capabilities**: A window size of 16K and a fill-in-the-blank task, supporting project-level code completion and infilling tasks.

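The fill-in-the-blank (infilling) capability above works by wrapping the code before and after the insertion point in sentinel tokens and asking the model to generate the missing middle. A minimal sketch of such a prompt builder follows; the sentinel spellings here are illustrative assumptions, so check the tokenizer's special tokens for the exact strings DeepSeek Coder uses:

```python
# Illustrative sentinel tokens (assumed spellings, not the model's actual
# special tokens -- consult the tokenizer for the real ones).
FIM_BEGIN = "<|fim_begin|>"
FIM_HOLE = "<|fim_hole|>"
FIM_END = "<|fim_end|>"

def build_fim_prompt(prefix: str, suffix: str) -> str:
    """Wrap the code before and after the cursor so the model
    generates the missing middle section."""
    return f"{FIM_BEGIN}{prefix}{FIM_HOLE}{suffix}{FIM_END}"

# Ask the model to fill in the partition step of a quicksort:
prefix = "def quick_sort(arr):\n    if len(arr) <= 1:\n        return arr\n"
suffix = "    return quick_sort(left) + [pivot] + quick_sort(right)\n"
print(build_fim_prompt(prefix, suffix))
```

The completion returned by the model is then inserted between the prefix and suffix in the source file.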
### 2. Model Summary

deepseek-coder-33b-instruct is a 33B-parameter model initialized from deepseek-coder-33b-base and fine-tuned on 2B tokens of instruction data.

- **Home Page:** [DeepSeek](https://deepseek.com/)
- **Repository:** [deepseek-ai/deepseek-coder](https://github.com/deepseek-ai/deepseek-coder)
- **Chat With DeepSeek Coder:** [DeepSeek-Coder](https://coder.deepseek.com/)

### 3. How to Use

Here are some examples of how to use our model.

#### Chat Model Inference

```python
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("deepseek-ai/deepseek-coder-33b-instruct", trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained("deepseek-ai/deepseek-coder-33b-instruct", trust_remote_code=True).cuda()

system_prompt = "You are an AI programming assistant, utilizing the Deepseek Coder model, developed by Deepseek Company, and you only answer questions related to computer science. For politically sensitive questions, security and privacy issues, and other non-computer science questions, you will refuse to answer.\n"
messages = [
    {'role': 'user', 'content': "write a quick sort algorithm in python."}
]

# Build the prompt: user turns become "### Instruction:" blocks, assistant
# turns become "### Response:" blocks terminated by the <|EOT|> token.
prompt = system_prompt
for message in messages:
    if message['role'] == 'user':
        prompt += f"### Instruction:\n{message['content']}\n"
    else:
        prompt += f"### Response:\n{message['content']}\n"
        prompt += "<|EOT|>\n"
prompt += "### Response:\n"

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
# 32021 is the id of the <|EOT|> token
outputs = model.generate(**inputs, max_new_tokens=256, do_sample=True, top_k=50, top_p=0.95, num_return_sequences=1, eos_token_id=32021)
print(tokenizer.decode(outputs[0]))
```
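The template logic in the snippet above can be factored into a small standalone helper, which is convenient for multi-turn conversations. This is a sketch of the same format; `build_prompt` is our own name, not part of the transformers API:

```python
def build_prompt(system_prompt: str, messages: list) -> str:
    """Assemble the '### Instruction:' / '### Response:' template used above.

    Assistant turns are terminated with the <|EOT|> token, and the prompt
    ends with an open '### Response:' block for the model to complete.
    """
    prompt = system_prompt
    for message in messages:
        if message['role'] == 'user':
            prompt += f"### Instruction:\n{message['content']}\n"
        else:
            prompt += f"### Response:\n{message['content']}\n<|EOT|>\n"
    prompt += "### Response:\n"
    return prompt

# Multi-turn example: the earlier assistant answer is closed with <|EOT|>.
history = [
    {'role': 'user', 'content': "write a quick sort algorithm in python."},
    {'role': 'assistant', 'content': "def quick_sort(arr): ..."},
    {'role': 'user', 'content': "now make it iterative."},
]
print(build_prompt("You are a coding assistant.\n", history))
```

The resulting string is passed to the tokenizer exactly as in the inference example above.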

### 4. License

This code repository is licensed under the MIT License. The use of the DeepSeek Coder model and weights is subject to the Model License. DeepSeek Coder supports commercial use.

See the [LICENSE-MODEL](https://github.com/deepseek-ai/deepseek-coder/blob/main/LICENSE-MODEL) for more details.

### 5. Contact

If you have any questions, please raise an issue or contact us at [agi_code@deepseek.com](mailto:agi_code@deepseek.com).