Stojanco Tudzarski committed
Commit 8b57399
1 Parent(s): e51eef1

times-mk-news-2010-20115 added

Files changed (1): README.md +1 -1
README.md CHANGED
@@ -7,6 +7,7 @@ tags:
 license: Apache 2.0
 datasets:
 - wiki-mk
+- time-mk-news-2010-2015
 ---
 
 # MK-RoBERTa base model
@@ -21,7 +22,6 @@ This way, the model learns an inner representation of the English language that
 
 # Intended uses & limitations
 You can use the raw model for masked language modeling, but it's mostly intended to be fine-tuned on a downstream task. See the model hub to look for fine-tuned versions of a task that interests you.
-
 Note that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked) to make decisions, such as sequence classification, token classification, or question answering. For tasks such as text generation, you should look at models like GPT2.
 
 # How to use
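
The "# How to use" section of the README is cut off by this diff. For context, a minimal sketch of querying a RoBERTa-style model for masked language modeling with the Hugging Face `transformers` fill-mask pipeline could look like the following. The repository id `macedonizer/mk-roberta-base` is an assumption, not taken from this commit; substitute the model's actual hub id.

```python
# Minimal sketch: masked language modeling with the fill-mask pipeline.
# NOTE: the model id below is an assumed placeholder, not confirmed by this commit.
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="macedonizer/mk-roberta-base")

# RoBERTa tokenizers use "<mask>" as the mask token.
for prediction in fill_mask("Скопје е главен <mask> на Македонија."):
    print(prediction["token_str"], prediction["score"])
```

Each prediction is a dict with the filled-in token (`token_str`) and its probability (`score`), so the loop prints the model's top candidates for the masked position.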