xDAN2099 committed
Commit
7386fd7
1 Parent(s): affab6f

Update README.md

Files changed (1)
  1. README.md +9 -3
README.md CHANGED
@@ -5,18 +5,24 @@ license: apache-2.0

APUS-xDAN-4.0-MOE
Introduction
- APUS-xDAN-4.0-MOE is a transformer-based MoE decoder-only language model pretrained on a large amount of data.

For more details, please refer to our blog post and GitHub repo.

Model Details
- APUS-xDAN-4.0-MOE employs a Mixture of Experts (MoE) architecture in which the models are upcycled from dense language models. For instance, APUS-xDAN-4.0-MOE is upcycled from Qwen-1.8B. It has 14.3B parameters in total and 2.7B activated parameters at runtime; while achieving performance comparable to Qwen1.5-7B, it requires only 25% of the training resources. We also observed that its inference speed is 1.74 times that of Qwen1.5-7B.
 
 
Requirements
The code for APUS-xDAN-4.0-MOE has been merged into the latest Hugging Face transformers, and we advise you to build from source with the command pip install git+https://github.com/huggingface/transformers, or you might encounter errors when loading the model.

-
 
 
 

License
APUS-xDAN-4.0-MOE is licensed under the LLAMA 2 Community License, Copyright (c) Meta Platforms, Inc. All Rights Reserved.
 

APUS-xDAN-4.0-MOE
+
Introduction
+ APUS-xDAN-4.0-MOE is a transformer-based MoE decoder-only language model aligned on a large amount of data.

For more details, please refer to our blog post and GitHub repo.

Model Details
+ APUS-xDAN-4.0-MOE employs a Mixture of Experts (MoE) architecture in which the models are upcycled from dense language models. For instance, APUS-xDAN-4.0-MOE is upcycled from the xDAN-L2 series, which are high-performance aligned models. It has 136B parameters in total and 30B activated parameters during runtime.
+ Optimized with advanced quantization techniques, our open-source release is only 42GB in size and runs well on consumer GPUs such as the 4090 and 3090.
+

Requirements
The code for APUS-xDAN-4.0-MOE has been merged into the latest Hugging Face transformers, and we advise you to build from source with the command pip install git+https://github.com/huggingface/transformers, or you might encounter errors when loading the model.
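As a minimal sketch of what the requirements above imply, assuming transformers has been installed from source and the checkpoint is published on the Hugging Face Hub (the repo id below is a placeholder, not confirmed by this README), loading and generating could look like this:

```python
# Install transformers from source first, as advised above:
#   pip install git+https://github.com/huggingface/transformers
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "xDAN-AI/APUS-xDAN-4.0-MOE"  # hypothetical repo id; substitute the real Hub path

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",   # keep the checkpoint's native precision
    device_map="auto",    # requires `accelerate`; shards the 136B-parameter MoE across available GPUs
)

prompt = "Briefly introduce yourself."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```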

+
+ Usage
+ llama.cpp
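The Usage section above names llama.cpp; as one possible illustration, the sketch below uses the llama-cpp-python bindings instead of the llama.cpp CLI (an assumption made here to keep the examples in Python), with a hypothetical GGUF file produced from the ~42GB quantized release mentioned in Model Details:

```python
# Sketch of running a quantized GGUF build of APUS-xDAN-4.0-MOE via llama-cpp-python.
# The file name and layer count are hypothetical; a ~42GB quantized model needs
# partial GPU offload on a 24GB card such as a 3090 or 4090.
from llama_cpp import Llama

llm = Llama(
    model_path="APUS-xDAN-4.0-MOE-q4.gguf",  # hypothetical quantized file
    n_ctx=4096,        # context window
    n_gpu_layers=24,   # layers offloaded to the GPU; tune to fit available VRAM
)

out = llm("Explain what a mixture-of-experts model is.", max_tokens=128)
print(out["choices"][0]["text"])
```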
+

License
APUS-xDAN-4.0-MOE is licensed under the LLAMA 2 Community License, Copyright (c) Meta Platforms, Inc. All Rights Reserved.