Update README.md
Introduction
APUS-xDAN-4.0-MOE is a transformer-based, decoder-only language model trained on a vast corpus of data to ensure robust performance.
It is an enhanced MoE (Mixture of Experts) model built on top of a LLaMA architecture strengthened by continued pre-training, and further optimized with human feedback algorithms to improve its reasoning, mathematical, and logical capabilities during inference.
For more comprehensive information, please visit our blog post and GitHub repository:
https://github.com/shootime2021/APUS-xDAN-4.0-moe
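
If the checkpoint is published on the Hugging Face Hub, inference should follow the standard `transformers` causal-LM pattern. The sketch below is illustrative rather than an official quickstart: the Hub repo ID is an assumption taken from the GitHub path above, and the prompt is arbitrary.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Assumed Hub ID (mirrors the GitHub path above); substitute the actual model ID if it differs.
model_id = "shootime2021/APUS-xDAN-4.0-moe"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    device_map="auto",   # shard the MoE weights across available GPUs (requires `accelerate`)
    torch_dtype="auto",  # load in the dtype stored in the checkpoint
)

prompt = "Explain step by step why 97 is a prime number."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```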
Model Details