freedomking committed
Commit bc2a47b
Parent(s): ae498db
Update README.md

README.md CHANGED
@@ -1,3 +1,4 @@
MC-BERT is a novel conceptualized representation learning approach for the medical domain. First, we use a different mask generation procedure that masks spans of tokens rather than only individual random tokens. We also introduce two masking strategies, namely whole entity masking and whole span masking. Finally, MC-BERT splits the input document into segments based on the actual "sentences" provided by the user, which serve as positive samples, and samples random sentences from other documents as negative samples for next sentence prediction.
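
As a rough illustration of these ideas, the Python sketch below shows one way whole entity masking and the positive/negative sentence sampling for next sentence prediction could look. The function names, the entity-span input format, and the 15% masking probability are assumptions made here for clarity; they are not taken from the MC-BERT codebase.

```python
import random

MASK_TOKEN = "[MASK]"

def whole_entity_mask(tokens, entity_spans, mask_prob=0.15):
    """Whole entity masking: if an entity is selected, mask every token in its span.

    tokens: list of tokens, e.g. ["患", "者", "有", "高", "血", "压"]
    entity_spans: list of (start, end) index pairs marking medical entities,
                  e.g. [(3, 6)] for the entity "高血压" (hypertension)
    """
    masked = list(tokens)
    labels = [None] * len(tokens)          # None = not a prediction target
    for start, end in entity_spans:
        if random.random() < mask_prob:
            for i in range(start, end):
                labels[i] = masked[i]      # keep the original token as the MLM label
                masked[i] = MASK_TOKEN     # replace every token of the entity
    return masked, labels


def sample_nsp_pair(doc_sentences, other_docs):
    """Build one next-sentence-prediction pair.

    Consecutive sentences from the same document form a positive sample;
    a random sentence from another document forms a negative sample.
    Assumes the document has at least two sentences.
    """
    i = random.randrange(len(doc_sentences) - 1)
    if random.random() < 0.5:
        return doc_sentences[i], doc_sentences[i + 1], 1   # positive: is next sentence
    random_doc = random.choice(other_docs)
    return doc_sentences[i], random.choice(random_doc), 0  # negative: not next sentence
```
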
![MC-BERT model architecture](https://github.com/alibaba-research/ChineseBLUE/raw/master/figs/c_bert_model.jpg)