CharonBony committed
Commit c3bb977
1 Parent(s): 0cba348

update usage

Files changed (1)
  1. README.md +24 -16
README.md CHANGED
---
license: mit
license_link: https://huggingface.co/microsoft/wavecoder-ultra-6.7b/blob/main/LICENSE
language:
- en
library_name: transformers
datasets:
- humaneval
pipeline_tag: text-generation
tags:
- code
metrics:
- code_eval
---

<h1 align="center">
🌊 WaveCoder: Widespread And Versatile Enhanced Code LLM
</h1>

<p align="center">
<a href="https://arxiv.org/abs/2312.14187"><b>[📜 Paper]</b></a> •
<!-- <a href=""><b>[🤗 HF Models]</b></a> • -->
Repo for "<a href="https://arxiv.org/abs/2312.14187" target="_blank">WaveCoder: Widespread And Versatile Enhanced Instruction Tuning with Refined Data Generation</a>"
</p>

## 🔥 News

- [2024/04/10] 🔥🔥🔥 WaveCoder repo and models released on [🤗 Hugging Face](https://huggingface.co/microsoft/wavecoder-ultra-6.7b)!
- [2023/12/26] WaveCoder paper released.

## 💡 Introduction

WaveCoder 🌊 is a series of large language models (LLMs) for the coding domain, designed to solve code-related problems through instruction-following learning. Its training data was generated from a subset of CodeSearchNet using an LLM-based generator-discriminator framework that we proposed, and covers four general code-related tasks: code generation, code summarization, code translation, and code repair.
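
Schematically, the data pipeline works as follows. The sketch below is illustrative only; all helper names and prompts are hypothetical, and the actual method is described in the [paper](https://arxiv.org/abs/2312.14187) and implemented in the GitHub repo.

```python
# Illustrative sketch of the generator-discriminator data pipeline described
# above. Names and prompts are hypothetical, not the released implementation.
TASKS = ["code generation", "code summarization", "code translation", "code repair"]

def generate_example(llm, snippet: str, task: str) -> dict:
    """Generator: turn a raw CodeSearchNet snippet into an instruction pair."""
    prompt = f"Write a {task} instruction and its answer based on this code:\n{snippet}"
    return {"task": task, "pair": llm(prompt), "source": snippet}

def keep(llm, example: dict) -> bool:
    """Discriminator: ask an LLM to judge whether the generated pair is usable."""
    verdict = llm(f"Answer yes or no: is this instruction-answer pair correct?\n{example}")
    return verdict.strip().lower().startswith("yes")

def build_dataset(llm, snippets: list[str]) -> list[dict]:
    # Generate candidates for every task, then keep only those the judge accepts
    candidates = [generate_example(llm, s, t) for s in snippets for t in TASKS]
    return [ex for ex in candidates if keep(llm, ex)]
```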

| Model | HumanEval | MBPP (500) | HumanEvalFix (avg.) | HumanEvalExplain (avg.) |
| --- | --- | --- | --- | --- |
| GPT-4 | 85.4 | - | 47.8 | 52.1 |
| [🌊 WaveCoder-DS-6.7B](https://huggingface.co/microsoft/wavecoder-ds-6.7b) | 65.8 | 63.0 | 49.5 | 40.8 |
| [🌊 WaveCoder-Pro-6.7B](https://huggingface.co/microsoft/wavecoder-pro-6.7b) | 74.4 | 63.4 | 52.1 | 43.0 |
| [🌊 WaveCoder-Ultra-6.7B](https://huggingface.co/microsoft/wavecoder-ultra-6.7b) | 79.9 | 64.6 | 52.3 | 45.7 |
52
  ## 🪁 Evaluation
53
 
54
  Please refer to WaveCoder's [GitHub repo](https://github.com/microsoft/WaveCoder) for inference, evaluation, and training code.
55
 
56
+ ```python
57
+ # Load model directly
58
+ from transformers import AutoTokenizer, AutoModelForCausalLM
59
+ tokenizer = AutoTokenizer.from_pretrained("microsoft/wavecoder-ultra-6.7b")
60
+ model = AutoModelForCausalLM.from_pretrained("microsoft/wavecoder-ultra-6.7b")
61
+ ```
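
A minimal generation sketch follows, continuing from the snippet above; the prompt and decoding settings are illustrative, not an official inference recipe.

```python
# Generate a completion with the loaded model (illustrative settings)
import torch

prompt = "Write a Python function that checks whether a number is prime."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

with torch.no_grad():
    output = model.generate(
        **inputs,
        max_new_tokens=256,               # cap the completion length
        do_sample=False,                  # greedy decoding for reproducibility
        pad_token_id=tokenizer.eos_token_id,
    )

# Decode only the newly generated tokens, skipping the prompt
completion = tokenizer.decode(output[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True)
print(completion)
```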
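
For benchmark-style scoring, completions can be checked with the Hugging Face `code_eval` metric (pass@k on HumanEval-style unit tests). This is a generic sketch, not the project's official evaluation harness.

```python
# Score candidate completions with pass@k via the `evaluate` library
import os
import evaluate

# The metric executes model-generated code; opting in is required.
os.environ["HF_ALLOW_CODE_EVAL"] = "1"

code_eval = evaluate.load("code_eval")

# One toy problem: a unit test and a list of candidate completions for it
tests = ["assert add(2, 3) == 5"]
candidates = [["def add(a, b):\n    return a + b"]]

pass_at_k, results = code_eval.compute(references=tests, predictions=candidates, k=[1])
print(pass_at_k)  # e.g. {'pass@1': 1.0}
```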

## 📖 License

This code repository is licensed under the MIT License. The use of DeepSeek Coder models is subject to its [License](https://github.com/deepseek-ai/DeepSeek-Coder/blob/main/LICENSE-MODEL).
 
## ☕️ Citation

If you find this repository helpful, please consider citing our paper:

```
...
  year={2023}
}
```

## Note

WaveCoder models are trained on synthetic data generated by OpenAI models. Please pay attention to OpenAI's [terms of use](https://openai.com/policies/terms-of-use) when using these models and datasets.