---
license: apache-2.0
language:
- zh
- en
---
# Chinese-CodeLlama-7B-SFT-V2
We added more than 7k Python code instructions and performed SFT on top of our [Chinese-CodeLlama-7B-SFT-V1](https://huggingface.co/frankminors123/Chinese-CodeLlama-7B-SFT-V1). Following the work of [Code Llama](https://ai.meta.com/research/publications/code-llama-open-foundation-models-for-code/), we increased the base period of the rotary positional embeddings (RoPE) from 10,000 to 1,000,000.
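When the model is loaded with Hugging Face `transformers`, the enlarged RoPE base can be checked through the standard `rope_theta` config field (a minimal sketch, assuming the released `config.json` exposes this value):
```python
# Minimal sketch: inspect the RoPE base period via the standard LlamaConfig
# `rope_theta` field (assumes the released config.json exposes this value).
from transformers import AutoConfig

config = AutoConfig.from_pretrained("frankminors123/Chinese-CodeLlama-7B-SFT-V2")
print(config.rope_theta)  # expected to report 1000000 after the change described above
```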
The Chinese prompt template used is as follows:
```python
# English rendering of the template: "Below is an instruction that describes a task,
# paired with an input that provides further context. Please give a response that
# satisfies the request as well as possible." The sections are 指令 (instruction),
# 输入 (input), and 回答 (response).
PROMPT_TEMPLATE = (
    "下面是描述一项任务的指令,并且与一则输入配对用来提供更多的上下文。请给出尽可能满足请求的回答.\n"
    "### 指令:\n{instruction}\n### 输入:\n{input}\n### 回答:\n"
)
```
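A minimal generation sketch with Hugging Face `transformers` follows; the instruction string and generation parameters are illustrative examples only, not settings taken from our training setup:
```python
# Minimal usage sketch with Hugging Face transformers; the instruction string
# and generation parameters are illustrative examples only.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "frankminors123/Chinese-CodeLlama-7B-SFT-V2"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# PROMPT_TEMPLATE is the template defined above.
prompt = PROMPT_TEMPLATE.format(
    instruction="编写一个Python函数,计算列表中所有偶数之和。",  # "Write a Python function that sums the even numbers in a list."
    input="",  # no additional context for this example
)
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=256, do_sample=False)
# Decode only the tokens generated after the prompt.
print(tokenizer.decode(outputs[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))
```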