wangyulong committed on
Commit 2ec65a8
1 parent: cc9c507

Update README.md

Files changed (1)
  1. README.md +1 -1
README.md CHANGED
@@ -24,7 +24,7 @@ model = BertModel.from_pretrained("Langboat/mengzi-bert-base")
 ## Scores on nine chinese tasks (without any data augmentation)
 |Model|AFQMC|TNEWS|IFLYTEK|CMNLI|WSC|CSL|CMRC|C3|CHID|
 |-|-|-|-|-|-|-|-|-|-|
-|CLUE RoBERTa-wwm-ext Baseline|74.04|56.94|60.31|80.51|67.80|81.00|75.20|66.50|83.62|
+|RoBERTa-wwm-ext(CLUE Baseline)|74.04|56.94|60.31|80.51|67.80|81.00|75.20|66.50|83.62|
 |Mengzi-BERT-base|74.58|57.97|60.68|82.12|87.50|85.40|78.54|71.70|84.16|
 
 ## Citation
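
For context, the hunk header above quotes the README's model-loading line, `model = BertModel.from_pretrained("Langboat/mengzi-bert-base")`. Below is a minimal usage sketch around that line; the `BertTokenizer` class choice and the example input are assumptions for illustration and are not part of this commit.

```python
# Minimal sketch (assumption for illustration; not part of this commit):
# load the checkpoint referenced in the hunk header and run one forward pass.
from transformers import BertTokenizer, BertModel

tokenizer = BertTokenizer.from_pretrained("Langboat/mengzi-bert-base")
model = BertModel.from_pretrained("Langboat/mengzi-bert-base")

inputs = tokenizer("孟子是战国时期的思想家。", return_tensors="pt")
outputs = model(**inputs)
print(outputs.last_hidden_state.shape)  # (1, sequence_length, hidden_size)
```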