yacht committed
Commit 85dd6f9 • 1 Parent(s): d2f58a9

add README.md for model card

Files changed (1): README.md ADDED (+39 −0)
---
language: zh
license: cc-by-sa-4.0
tags:
- word segmentation
datasets:
- ctb6
- as
- cityu
- msra
- pku
- sxu
- cnc
---

# Multi-criteria BERT base Chinese with Lattice for Word Segmentation

This is a variant of the pre-trained [BERT](https://github.com/google-research/bert) model.
The model was pre-trained on Chinese-language texts and fine-tuned for word segmentation.
This version processes input texts at the character level, with word-level information incorporated through a lattice structure.

The scripts for the pre-training are available at [tchayintr/latte-ptm-ws](https://github.com/tchayintr/latte-ptm-ws).

## Model architecture

The model architecture is described in this [paper](https://www.jstage.jst.go.jp/article/jnlp/30/2/30_456/_article/-char/ja).

## Training Data

The model is trained on multiple Chinese word-segmented datasets, including CTB6, SIGHAN2005 (AS, CITYU, MSRA, PKU), SIGHAN2008 (SXU), and CNC.
The datasets can be accessed [here](https://github.com/hankcs/multi-criteria-cws/tree/master/data).

## Licenses

The pre-trained model is distributed under the terms of the [Creative Commons Attribution-ShareAlike 4.0](https://creativecommons.org/licenses/by-sa/4.0/) license.

## Acknowledgments

This model was trained with GPU servers provided by the [Okumura-Funakoshi NLP Group](https://lr-www.pi.titech.ac.jp).