DataLinguistic committed
Commit 5b8c5f7
Parent(s): b0bfba5
Update README.md
README.md
CHANGED
@@ -16,20 +16,21 @@ model-index:
   metrics:
   - name: pass@1
     type: pass@1
-    value: 0.
+    value: 0.701
   verified: false
 
+
 ---
 
-# DataLinguistic-
+# DataLinguistic-34B-V1.0 Chinese-English Question Answering Model
 
 ## Model Overview
 
-DataLinguistic-34B-
+DataLinguistic-34B-V1.0 is a Chinese-English question answering model fine-tuned from Huggingface's CodeLlama-34b model with 4-bit quantization on DataLinguistic's proprietary datasets.
 
 ## Model Architecture
 
-DataLinguistic-34B-4bit-V1.0 inherits the encoder-decoder structure from Llama with 34B parameters.
+DataLinguistic-34B-4bit-V1.0 inherits the encoder-decoder structure from Llama with 34B parameters.
 
 ## Training Datasets
 
@@ -53,7 +54,6 @@ The model can be used for a wide range of Chinese-English question answering and
 ## Model Advantages
 
 - Based on huge model CodeLlama-34b with 34B parameters
-- 4-bit quantization reduces compute
 - Fine-tuned on large-scale Chinese-English QA datasets for high quality
 
 ## Usage
@@ -64,7 +64,7 @@ The model can be used for a wide range of Chinese-English question answering and
 
 ## Version
 
-Current version: DataLinguistic-34B-
+Current version: DataLinguistic-34B-V1.0
 
 ## Author
 
@@ -74,15 +74,15 @@ Tang Zhengzheng
 
 DataLinguistic team
 
-# DataLinguistic-34B-
+# DataLinguistic-34B-V1.0 中英文问答模型
 
 ## 模型简介
 
-DataLinguistic-34B-
+DataLinguistic-34B-V1.0是一个基于Huggingface的CodeLlama-34b模型在DataLinguistic自建数据集上微调的中文英文问答模型。
 
 ## 模型结构
 
-DataLinguistic-34B-
+DataLinguistic-34B-V1.0 inherits the encoder-decoder structure from CodeLlama with 34B parameters.
 
 ## 模型训练数据集
 
@@ -103,7 +103,6 @@ DataLinguistic-34B-4bit-V1.0 inherits the encoder-decoder structure from CodeLla
 ## 模型优势
 
 - 基于大模型Llama-34b,参数量达34亿
-- 采用4bit量化,降低运算量
 - 在大规模中英文问答数据集上进行微调,质量较高
 
 ## 使用步骤
@@ -114,7 +113,7 @@ DataLinguistic-34B-4bit-V1.0 inherits the encoder-decoder structure from CodeLla
 
 ## 版本信息
 
-当前版本:DataLinguistic-34B-
+当前版本:DataLinguistic-34B-V1.0
 
 ## 作者
 
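The pass@1 value this commit records in the metadata (0.701) is the standard code/QA evaluation metric: the fraction of problems solved by a single generated sample. A minimal sketch of the widely used unbiased pass@k estimator, of which pass@1 is the k=1 case (the function name and sample counts here are illustrative, not from this repository):

```python
from math import comb

def pass_at_k(n: int, c: int, k: int) -> float:
    """Unbiased pass@k estimator: given n generated samples per problem,
    of which c pass, estimate the probability that at least one of k
    randomly drawn samples passes."""
    if n - c < k:
        # Every size-k draw must contain at least one passing sample.
        return 1.0
    return 1.0 - comb(n - c, k) / comb(n, k)

# With k=1 the estimator reduces to the plain pass rate c/n:
rate = pass_at_k(10, 7, 1)
```

A reported pass@1 of 0.701 thus means roughly 70% of benchmark problems were solved on the first attempt.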
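The model card describes the model as fine-tuned with 4-bit quantization. As a toy illustration of the idea (this is a plain-Python absmax sketch, not the NF4 scheme the bitsandbytes/transformers stack actually uses), each float weight is mapped to a 4-bit integer code plus a per-group scale, cutting storage to a quarter of 16-bit weights at the cost of bounded rounding error:

```python
def quantize_4bit(weights):
    """Map float weights to 4-bit integer codes in [-8, 7] plus a scale."""
    scale = max(abs(w) for w in weights) / 7.0
    return [max(-8, min(7, round(w / scale))) for w in weights], scale

def dequantize_4bit(codes, scale):
    """Recover approximate float weights from the 4-bit codes."""
    return [code * scale for code in codes]

weights = [0.12, -0.5, 0.33, 0.07, -0.21]
codes, scale = quantize_4bit(weights)
restored = dequantize_4bit(codes, scale)
# Each restored weight is within one quantization step of the original.
assert all(abs(a - b) <= scale for a, b in zip(weights, restored))
```

Real 4-bit inference keeps the codes in memory and dequantizes on the fly inside each matrix multiply, which is why the card cites reduced compute and memory cost.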