Commit f66a060 by ZenQin
1 Parent(s): e3b7323

Update README.md

Files changed (1):
  1. README.md +1 -1
README.md CHANGED

@@ -12,7 +12,7 @@ license: apache-2.0
 
 ## Key Messages
 
-1. JetMoE-8B is **trained with less than $ 0.1 million**<sup>1</sup> **cost but outperforms LLaMA2-7B from Meta AI**, who has multi-billion-dollar training resources. LLM training can be **much cheaper than people generally thought**.
+1. JetMoE-8B is **trained with less than $ 0.1 million**<sup>1</sup> **cost but outperforms LLaMA2-7B from Meta AI**, who has multi-billion-dollar training resources. LLM training can be **much cheaper than people previously thought**.
 
 2. JetMoE-8B is **fully open-sourced and academia-friendly** because:
 - It **only uses public datasets** for training, and the code is open-sourced. No proprietary resource is needed.