toastynews committed on
Commit 52f319d
1 parent: dc52ea7

Pushing model manually

Files changed (1):
  1. README.md +56 -0

README.md CHANGED
---
language: yue
license: apache-2.0
tags:
- generated_from_trainer
model-index:
- name: electra-hongkongese-small-hk-ws
  results: []
---

# electra-hongkongese-small-hk-ws

This model is a fine-tuned version of [toastynews/electra-hongkongese-small-discriminator](https://huggingface.co/toastynews/electra-hongkongese-small-discriminator) on [HKCanCor](https://pycantonese.org/data.html#built-in-data) and [CityU](http://sighan.cs.uchicago.edu/bakeoff2005/) for word segmentation.

## Model description

Performs word segmentation on text from Hong Kong.
There are two versions: hk, trained only on text from Hong Kong, and hkt, trained on text from Hong Kong and Taiwan. Each version comes in base and small model sizes.

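As a quick illustration, the model can also be loaded directly as a token-classification model with the Hugging Face `pipeline`. This is a minimal sketch: the example sentence is illustrative only, and the exact tag names in the output depend on the model's label config.

```python
# Minimal sketch: run the segmenter as a raw token-classification model.
# The output is one BI-style tag per token; mapping tags back to words is
# left to the caller (or to a library such as CKIP Transformers).
from transformers import pipeline

tagger = pipeline(
    "token-classification",
    model="toastynews/electra-hongkongese-small-hk-ws",
)

for token in tagger("我哋今日去睇戲"):  # illustrative Hong Kong Cantonese sentence
    print(token["word"], token["entity"])
```
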
## Intended uses & limitations

Trained to handle both Hongkongese/Cantonese and Standard Chinese from Hong Kong. Text from other regions and English do not work as well.
The easiest way to use the model is with the CKIP Transformers library; see the sketch below.

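The snippet below is a minimal sketch of that route; it assumes the `model_name` argument of `CkipWordSegmenter` accepts a Hugging Face model id (check the CKIP Transformers documentation for your installed version), and the example sentences are illustrative only.

```python
# Minimal sketch: end-to-end word segmentation via CKIP Transformers.
# Assumes `model_name` accepts a Hugging Face model id.
from ckip_transformers.nlp import CkipWordSegmenter

ws_driver = CkipWordSegmenter(
    model_name="toastynews/electra-hongkongese-small-hk-ws",
)

sentences = ["我哋今日去睇戲。", "佢哋喺旺角食晏。"]  # illustrative only
for words in ws_driver(sentences):
    print(" ".join(words))
```
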
## Training and evaluation data

HKCanCor and CityU are converted to a BI-encoded word segmentation dataset in Hugging Face format using code from [finetune-ckip-transformers](https://github.com/toastynews/finetune-ckip-transformers).

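For context, BI encoding tags the first character of each word with `B` and every following character with `I`. The helper below is a hypothetical sketch of that conversion, not the actual script from finetune-ckip-transformers.

```python
# Hypothetical sketch of BI encoding for word segmentation:
# B = first character of a word, I = any following character.
def to_bi_tags(words):
    chars, tags = [], []
    for word in words:
        for i, ch in enumerate(word):
            chars.append(ch)
            tags.append("B" if i == 0 else "I")
    return chars, tags

# Pre-segmented sentence "我哋 今日 去 睇戲" (illustrative only)
print(to_bi_tags(["我哋", "今日", "去", "睇戲"]))
# (['我', '哋', '今', '日', '去', '睇', '戲'], ['B', 'I', 'B', 'I', 'B', 'B', 'I'])
```
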
## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (see the sketch after this list):
- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 3.0

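For orientation, these settings correspond roughly to the following Transformers `TrainingArguments`; this is a hedged sketch, not the exact invocation used by the finetune-ckip-transformers scripts, and the output directory name is hypothetical.

```python
# Sketch only: the listed hyperparameters expressed as TrainingArguments.
# Adam with betas=(0.9, 0.999) and epsilon=1e-08 plus a linear scheduler
# are the library defaults, so they need no explicit arguments here.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="electra-hongkongese-small-hk-ws",  # hypothetical path
    learning_rate=5e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=3.0,
)
```
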
### Training results

| dataset    | token_f  | token_p  | token_r  |
|:-----------|---------:|---------:|---------:|
| ud yue_hk  |   0.9468 |   0.9484 |   0.9453 |
| ud zh_hk   |   0.9277 |   0.9350 |   0.9205 |
| _hkcancor_ | _0.9769_ | _0.9742_ | _0.9795_ |
| cityu      |   0.9750 |   0.9741 |   0.9760 |
| as         |   0.9187 |   0.9154 |   0.9219 |

_The model was trained on hkcancor; that row is reported for reference only._

### Framework versions

- Transformers 4.27.0.dev0
- Pytorch 1.10.0
- Datasets 2.10.0
- Tokenizers 0.13.2