fenchri committed 9a1c8c7 (1 parent: 8c2e947)

Update README.md

Files changed (1): README.md (+1 −1)
README.md CHANGED
```diff
@@ -46,7 +46,7 @@ language:
 
 # Model Card for EntityCS-39-MLM-xlmr-base
 
-This model has been trained on the EntityCS corpus, a multilingual corpus from Wikipedia with replaces entities in different languages.
+This model has been trained on the EntityCS corpus, an English corpus from Wikipedia with replaced entities in different languages.
 The corpus can be found in [https://huggingface.co/huawei-noah/entity_cs](https://huggingface.co/huawei-noah/entity_cs), check the link for more details.
 
 Firstly, we employ the conventional 80-10-10 MLM objective, where 15% of sentence subwords are considered as masking candidates. From those, we replace subwords
```
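The 80-10-10 MLM objective described in the README can be sketched as follows. This is a minimal illustration of the conventional masking rule, not the authors' implementation; the function name and signature are hypothetical:

```python
import random

def mask_tokens(token_ids, vocab_size, mask_id, mask_prob=0.15):
    """Apply conventional 80-10-10 MLM masking to a list of token ids.

    15% of tokens are selected as masking candidates; of those,
    80% are replaced with [MASK], 10% with a random token,
    and 10% are left unchanged.
    """
    inputs = list(token_ids)
    labels = [-100] * len(inputs)  # -100 = position ignored by the loss
    for i in range(len(inputs)):
        if random.random() < mask_prob:
            labels[i] = inputs[i]  # model must predict the original token here
            r = random.random()
            if r < 0.8:
                inputs[i] = mask_id  # 80%: replace with [MASK]
            elif r < 0.9:
                inputs[i] = random.randrange(vocab_size)  # 10%: random token
            # else: 10% keep the original token (but still predict it)
    return inputs, labels
```

Candidates kept unchanged or randomized still contribute to the loss, which discourages the model from relying on the [MASK] token being present at inference time.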