seanghay committed
Commit 057d46b
1 Parent(s): 8cadda8

Update README.md

Files changed (1): README.md (+13 −5)
README.md CHANGED
@@ -1,14 +1,18 @@
 ---
-license: mit
+license: apache-2.0
 tags:
 - generated_from_trainer
 datasets:
-- kh_pos
+- seanghay/khPOS
 metrics:
 - precision
 - recall
 - f1
 - accuracy
+widget:
+- text: គាត់ផឹកទឹកនៅភ្នំពេញ
+- text: តើលោកស្រីបានសាកសួរទៅគាត់ទេ?
+- text: នេត្រា មិនដឹងសោះថាអ្នកជាមនុស្ស!
 model-index:
 - name: khmer-pos-roberta-10
   results:
@@ -34,14 +38,18 @@ model-index:
 - name: Accuracy
   type: accuracy
   value: 0.9735370853522176
+language:
+- km
+library_name: transformers
+pipeline_tag: token-classification
 ---

 <!-- This model card has been generated automatically according to the information the Trainer had access to. You
 should probably proofread and complete it, then remove this comment. -->

-# khmer-pos-roberta-10
+# Khmer Part of Speech Tagging with XLM RoBERTa

-This model is a fine-tuned version of [xlm-roberta-base](https://huggingface.co/xlm-roberta-base) on the kh_pos dataset.
+This model is a fine-tuned version of [xlm-roberta-base](https://huggingface.co/xlm-roberta-base) on the [khPOS](https://huggingface.co/seanghay/khPOS) dataset.
 It achieves the following results on the evaluation set:
 - Loss: 0.1063
 - Precision: 0.9512
@@ -95,4 +103,4 @@ The following hyperparameters were used during training:
 - Transformers 4.30.2
 - Pytorch 2.0.1+cu118
 - Datasets 2.13.1
-- Tokenizers 0.13.3
+- Tokenizers 0.13.3
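For context, the metrics the card tracks (precision, recall, F1, accuracy) are standard token-level classification scores over aligned gold and predicted tag sequences. A minimal sketch of how such per-tag metrics are computed; the tag names and sequences below are illustrative placeholders, not the khPOS tag set:

```python
# Illustrative sketch: token-level POS-tagging metrics of the kind listed
# in the model card. The tags here are hypothetical, not khPOS data.

def token_metrics(true_tags, pred_tags, positive_tag):
    """Score `positive_tag` one-vs-rest over aligned tag sequences."""
    pairs = list(zip(true_tags, pred_tags))
    tp = sum(t == positive_tag and p == positive_tag for t, p in pairs)
    fp = sum(t != positive_tag and p == positive_tag for t, p in pairs)
    fn = sum(t == positive_tag and p != positive_tag for t, p in pairs)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
    accuracy = sum(t == p for t, p in pairs) / len(pairs)
    return precision, recall, f1, accuracy

gold = ["NOUN", "VERB", "NOUN", "PUNC"]
pred = ["NOUN", "VERB", "VERB", "PUNC"]
print(token_metrics(gold, pred, "NOUN"))  # precision 1.0, recall 0.5, F1 ~0.667, accuracy 0.75
```

In practice the Trainer computes these with the `seqeval`/`evaluate` tooling across all tags; this sketch only shows the underlying arithmetic for a single tag.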