KoichiYasuoka committed on
Commit
4d8a975
1 Parent(s): e2708f6
Files changed (1)
  1. README.md +1 -1
README.md CHANGED
@@ -24,7 +24,7 @@ widget:
 
 ## Model Description
 
- This is a BERT model pre-trained on Classical Chinese texts for dependency-parsing (head-detection on long-unit-words) as question-answering, derived from [bert-ancient-chinese](https://huggingface.co/Jihuai/bert-ancient-chinese) and [UD_Classical_Chinese-Kyoto](https://github.com/UniversalDependencies/UD_Classical_Chinese-Kyoto). Use [MASK] inside `context` to avoid ambiguity when specifying a multiple-used word as `question`.
+ This is a BERT model pre-trained on Classical Chinese texts for dependency-parsing (head-detection on Universal Dependencies) as question-answering, derived from [bert-ancient-chinese](https://huggingface.co/Jihuai/bert-ancient-chinese) and [UD_Classical_Chinese-Kyoto](https://github.com/UniversalDependencies/UD_Classical_Chinese-Kyoto). Use [MASK] inside `context` to avoid ambiguity when specifying a multiple-used word as `question`.
 
 ## How to Use
 
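The description above frames head detection as question answering: the word of interest goes in `question`, the sentence in `context`, and the predicted answer span is that word's head. A minimal usage sketch, assuming the checkpoint id below (the repository id is not stated in this diff) and the standard `transformers` question-answering pipeline:

```python
# A minimal sketch, assuming the released checkpoint is named as below
# (the repository id is not shown in this diff) and using the standard
# transformers question-answering pipeline.
from transformers import pipeline

model_id = "KoichiYasuoka/bert-ancient-chinese-base-ud-head"  # assumed id
qa = pipeline("question-answering", model=model_id, tokenizer=model_id)

# The head of the word given as `question` is returned as the answer span.
# If the word occurs more than once, replace the occurrence you mean with
# [MASK] inside `context` to avoid ambiguity.
print(qa(question="孟子", context="孟子見梁惠王"))
```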