ymcui committed on
Commit 639a3e7 • Parent: de0b1a2

update info

Files changed (1): README.md (+3, −1)
README.md CHANGED

@@ -5,6 +5,8 @@ tags:
 - bert
 license: "apache-2.0"
 ---
+# This is a re-trained 4-layer RoBERTa-wwm-ext model.
+
 ## Chinese BERT with Whole Word Masking
 For further accelerating Chinese natural language processing, we provide **Chinese pre-trained BERT with Whole Word Masking**.

@@ -51,4 +53,4 @@ If you find the technical report or resource is useful, please cite the following:
 journal={arXiv preprint arXiv:1906.08101},
 year={2019}
 }
-```
+```
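The README line above refers to Whole Word Masking, where all WordPiece tokens of a selected word are masked together rather than individually. A minimal illustrative sketch of that idea (not the authors' actual pretraining code; function name, input format, and masking rate are assumptions for illustration):

```python
import random

def whole_word_mask(word_pieces, mask_rate=0.15, mask_token="[MASK]", seed=0):
    """Illustrative Whole Word Masking: mask every WordPiece token of a
    selected word together, instead of masking pieces independently.

    `word_pieces` is a list of words, each already split into WordPiece
    tokens, e.g. [["语", "##言"], ["模", "##型"]].  This is a toy sketch,
    not the pretraining pipeline from the cited technical report.
    """
    rng = random.Random(seed)
    out = []
    for pieces in word_pieces:
        if rng.random() < mask_rate:
            # Mask the whole word: every piece of this word is replaced.
            out.extend([mask_token] * len(pieces))
        else:
            out.extend(pieces)
    return out
```

With `mask_rate=1.0` every word is fully masked, so a multi-piece word never ends up partially visible; that is the key difference from per-token masking.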