xDAN2099 committed
Commit affab6f
1 Parent(s): 133020d

Update README.md

Files changed (1):
  1. README.md +2 -4

README.md CHANGED
@@ -4,8 +4,6 @@ license: apache-2.0
 
 
 
-
-
 APUS-xDAN-4.0-MOE
 Introduction
 APUS-xDAN-4.0-MOE is a transformer-based MoE decoder-only language model pretrained on a large amount of data.
@@ -20,5 +18,5 @@ The code of APUS-xDAN-4.0-MOE has been in the latest Hugging face transformers a
 
 
 
-Usage
-We do not advise you to use base language models for text generation. Instead, you can apply post-training, e.g., SFT, RLHF, continued pretraining, etc., on this model.
+License
+APUS-xDAN-4.0-MOE is licensed under the LLAMA 2 Community License, Copyright (c) Meta Platforms, Inc. All Rights Reserved.