Update README.md
README.md CHANGED
@@ -1,3 +1,8 @@
+---
+license: apache-2.0
+base_model:
+- internlm/internlm2_5-7b-chat
+---
 # Converted Qwen2 from InternLM2.5-7B-Chat
 
 ## Description
@@ -88,5 +93,4 @@ print(f"User Input: {prompt2}\nConverted LlaMA Response: {response_qwen2_and_spl
 To compare results with the original model, you can use this [code](https://github.com/silencelamb/naked_llama/blob/main/hf_example/hf_internlm_7b_qwen2_compare.py)
 
 ## More Info
-It was converted using the Python script available at [this repository](https://github.com/silencelamb/naked_llama/blob/main/hf_example/convert_internlm_to_qwen_hf.py)
-
+It was converted using the Python script available at [this repository](https://github.com/silencelamb/naked_llama/blob/main/hf_example/convert_internlm_to_qwen_hf.py)
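Since the converted checkpoint uses the stock Qwen2 architecture, it loads without custom modeling code, while the original InternLM2.5 checkpoint still needs `trust_remote_code=True`. The sketch below shows the kind of side-by-side check the linked compare script performs: run the same prompt through both models and print the two responses. It is a minimal sketch under stated assumptions, not the linked script itself; the local path `./converted_qwen2_checkpoint`, the prompt, and the decoding settings are placeholders.

```python
# Minimal sketch: compare the original InternLM2.5-7B-Chat with the
# converted Qwen2-format checkpoint on the same prompt.
# Only internlm/internlm2_5-7b-chat is a real hub id (from base_model);
# ./converted_qwen2_checkpoint is a placeholder local path.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

prompt = "Hello, who are you?"

def generate(model_path: str, trust_remote_code: bool = False) -> str:
    tokenizer = AutoTokenizer.from_pretrained(
        model_path, trust_remote_code=trust_remote_code
    )
    model = AutoModelForCausalLM.from_pretrained(
        model_path,
        torch_dtype=torch.float16,
        device_map="auto",
        trust_remote_code=trust_remote_code,
    )
    # Use the chat template so both models see the same formatted conversation.
    input_ids = tokenizer.apply_chat_template(
        [{"role": "user", "content": prompt}],
        add_generation_prompt=True,
        return_tensors="pt",
    ).to(model.device)
    # Greedy decoding so any divergence comes from the weights, not sampling.
    output_ids = model.generate(input_ids, max_new_tokens=128, do_sample=False)
    return tokenizer.decode(
        output_ids[0, input_ids.shape[1]:], skip_special_tokens=True
    )

# InternLM2.5's modeling code lives in its repo, hence trust_remote_code=True;
# the converted checkpoint uses the built-in Qwen2 architecture.
response_internlm = generate("internlm/internlm2_5-7b-chat", trust_remote_code=True)
response_qwen2 = generate("./converted_qwen2_checkpoint")  # placeholder path

print(f"User Input: {prompt}\nOriginal Response: {response_internlm}")
print(f"User Input: {prompt}\nConverted Response: {response_qwen2}")
```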