weiqipedia committed on
Commit
3455559
1 Parent(s): 2eb8615

Update README.md

Files changed (1): README.md (+1 -1)
README.md CHANGED
@@ -42,7 +42,7 @@ The SEA-LION-BERT model is built on the MosaicBERT architecture and has a vocabu
 
 For tokenization, the model employs our custom SEABPETokenizer, which is specially tailored for SEA languages, ensuring optimal model performance.
 
-The training data for SEA-LION-BERT encompasses 980B tokens.
+The training data for SEA-LION-BERT encompasses 790B tokens.
 
 - **Developed by:** Products Pillar, AI Singapore
 - **Funded by:** Singapore NRF