KoichiYasuoka committed
Commit b10e7f6
1 Parent(s): 4ae23d8

link with dependency-parsing

Files changed (1)
  1. README.md +1 -1
README.md CHANGED
@@ -16,7 +16,7 @@ widget:
 
 ## Model Description
 
-This is a BERT model pre-trained on Japanese Wikipedia texts, derived from [bert-large-japanese-char](https://huggingface.co/cl-tohoku/bert-large-japanese-char). Character-embeddings are enhanced to include all 常用漢字/人名用漢字 characters using [BertTokenizerFast](https://huggingface.co/transformers/model_doc/bert.html#berttokenizerfast). You can fine-tune `bert-large-japanese-char-extended` for downstream tasks, such as [POS-tagging](https://huggingface.co/KoichiYasuoka/bert-large-japanese-upos), dependency-parsing, and so on.
+This is a BERT model pre-trained on Japanese Wikipedia texts, derived from [bert-large-japanese-char](https://huggingface.co/cl-tohoku/bert-large-japanese-char). Character-embeddings are enhanced to include all 常用漢字/人名用漢字 characters using BertTokenizerFast. You can fine-tune `bert-large-japanese-char-extended` for downstream tasks, such as [POS-tagging](https://huggingface.co/KoichiYasuoka/bert-large-japanese-upos), [dependency-parsing](https://huggingface.co/KoichiYasuoka/bert-large-japanese-wikipedia-ud-head), and so on.
 
 ## How to Use
 
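For context, the paragraph touched by this commit says the model is loaded through BertTokenizerFast and can be fine-tuned for downstream tasks. A minimal sketch of loading it with the `transformers` library is shown below; the repository id `KoichiYasuoka/bert-large-japanese-char-extended` is assumed from the model name in the README, and the fill-mask call is illustrative rather than taken from this commit.

```python
# Sketch only: assumes the repository id below and standard transformers APIs.
from transformers import BertTokenizerFast, AutoModelForMaskedLM

model_id = "KoichiYasuoka/bert-large-japanese-char-extended"  # assumed repo id
tokenizer = BertTokenizerFast.from_pretrained(model_id)       # character-level tokenizer
model = AutoModelForMaskedLM.from_pretrained(model_id)        # BERT with masked-LM head

# Tokenize a short Japanese sentence with one masked character (example sentence is ours).
inputs = tokenizer("酸素ボンベを充[MASK]する。", return_tensors="pt")
outputs = model(**inputs)
print(outputs.logits.shape)  # (batch, sequence length, vocabulary size)
```

Fine-tuning for POS-tagging or dependency-parsing would replace the masked-LM head with a task-specific head, as in the linked `bert-large-japanese-upos` and `bert-large-japanese-wikipedia-ud-head` models.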