KoichiYasuoka committed on
Commit 2fcc4e8 • 1 Parent(s): bc67de7

dependency-parsing

Files changed (1)
  1. README.md +9 -1
README.md CHANGED
@@ -6,6 +6,7 @@ tags:
 - "token-classification"
 - "pos"
 - "wikipedia"
+- "dependency-parsing"
 datasets:
 - "universal_dependencies"
 license: "apache-2.0"
@@ -16,7 +17,7 @@ pipeline_tag: "token-classification"
 
 ## Model Description
 
-This is a BERT model pre-trained on Chinese Wikipedia texts (both simplified and traditional) for POS-tagging, derived from [chinese-roberta-wwm-ext](https://huggingface.co/hfl/chinese-roberta-wwm-ext). Every word is tagged by [UPOS](https://universaldependencies.org/u/pos/) (Universal Part-Of-Speech).
+This is a BERT model pre-trained on Chinese Wikipedia texts (both simplified and traditional) for POS-tagging and dependency-parsing, derived from [chinese-roberta-wwm-ext](https://huggingface.co/hfl/chinese-roberta-wwm-ext). Every word is tagged by [UPOS](https://universaldependencies.org/u/pos/) (Universal Part-Of-Speech).
 
 ## How to Use
 
@@ -26,6 +27,13 @@ tokenizer=AutoTokenizer.from_pretrained("KoichiYasuoka/chinese-roberta-base-upos
 model=AutoModelForTokenClassification.from_pretrained("KoichiYasuoka/chinese-roberta-base-upos")
 ```
 
+or
+
+```py
+import esupar
+nlp=esupar.load("KoichiYasuoka/chinese-roberta-base-upos")
+```
+
 ## See Also
 
 [esupar](https://github.com/KoichiYasuoka/esupar): Tokenizer POS-tagger and Dependency-parser with BERT/RoBERTa models
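For orientation beyond the snippets in the diff, a minimal sketch of how the tokenizer/model pair loaded in the README could be driven end to end to print one UPOS tag per token; the example sentence and the per-token argmax decoding are illustrative assumptions, not part of this commit:

```py
import torch
from transformers import AutoTokenizer, AutoModelForTokenClassification

tokenizer = AutoTokenizer.from_pretrained("KoichiYasuoka/chinese-roberta-base-upos")
model = AutoModelForTokenClassification.from_pretrained("KoichiYasuoka/chinese-roberta-base-upos")

# Hypothetical example sentence; any simplified or traditional Chinese text works.
text = "我是中国人"
inputs = tokenizer(text, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# Pick the highest-scoring label for each token and map it back to its tag name.
predictions = logits.argmax(dim=-1)[0].tolist()
tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0])
for token, label_id in zip(tokens, predictions):
    print(token, model.config.id2label[label_id])
```

Special tokens such as [CLS] and [SEP] also receive labels and would normally be filtered out; the esupar snippet added by this commit is intended to wrap tokenization, POS-tagging and dependency parsing in a single call instead.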