ymcui committed
Commit ebd5745 • 1 Parent(s): 68f50eb

update info

Files changed (1)
  1. README.md +4 -1
README.md CHANGED
@@ -5,6 +5,9 @@ tags:
 - bert
 license: "apache-2.0"
 ---
+
+# This is a re-trained 3-layer RoBERTa-wwm-ext model.
+
 ## Chinese BERT with Whole Word Masking
 For further accelerating Chinese natural language processing, we provide **Chinese pre-trained BERT with Whole Word Masking**.
 
@@ -51,4 +54,4 @@ If you find the technical report or resource is useful, please cite the following
   journal={arXiv preprint arXiv:1906.08101},
   year={2019}
 }
-```
+```
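The line added in this commit describes the checkpoint as a re-trained 3-layer RoBERTa-wwm-ext model. As a minimal sketch (the repository ID below is a hypothetical placeholder, not something named in this commit), one way to check that claim is to load the configuration with 🤗 Transformers and inspect `num_hidden_layers`:

```python
# Minimal sketch: verify the "3-layer" claim of a RoBERTa-wwm-ext-style checkpoint.
# NOTE: "hfl/some-rbt3-model" is a hypothetical placeholder, not a repo ID from this commit.
from transformers import AutoConfig, AutoModel, AutoTokenizer

model_id = "hfl/some-rbt3-model"  # placeholder; substitute the actual model repository

config = AutoConfig.from_pretrained(model_id)
print(config.num_hidden_layers)  # expected to print 3 for a 3-layer model

# RoBERTa-wwm-ext checkpoints use BERT-style weights and tokenization,
# so the generic Auto* classes are sufficient for loading.
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModel.from_pretrained(model_id)

outputs = model(**tokenizer("使用整词掩码的中文预训练模型", return_tensors="pt"))
print(outputs.last_hidden_state.shape)  # (batch, seq_len, hidden_size)
```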