nielsr committed on
Commit
4b7e4ee
1 Parent(s): 74e4aa2

Fix model card

Files changed (1)
  1. README.md +1 -5
README.md CHANGED
@@ -20,7 +20,7 @@ XLNet is a new unsupervised language representation learning method based on a n
 
 ## Intended uses & limitations
 
-You can use the raw model for masked language modeling, but it's mostly intended to be fine-tuned on a downstream task. See the [model hub](https://huggingface.co/models?search=xlm-roberta) to look for fine-tuned versions on a task that interests you.
+The model is mostly intended to be fine-tuned on a downstream task. See the [model hub](https://huggingface.co/models?search=xlnet) to look for fine-tuned versions on a task that interests you.
 
 Note that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked) to make decisions, such as sequence classification, token classification or question answering. For tasks such as text generation, you should look at models like GPT2.
 
@@ -62,7 +62,3 @@ last_hidden_states = outputs.last_hidden_state
   bibsource = {dblp computer science bibliography, https://dblp.org}
 }
 ```
-
-<a href="https://huggingface.co/exbert/?model=xlm-roberta-base">
-<img width="300px" src="https://cdn-media.huggingface.co/exbert/button.png">
-</a>
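
For context, the second hunk header quotes `last_hidden_states = outputs.last_hidden_state` from the README's usage snippet. The sketch below is not part of this commit; it is a minimal illustration of that kind of usage, assuming the `xlnet-base-cased` checkpoint and the generic `Auto*` classes (the card itself may use the XLNet-specific classes).

```python
# Hypothetical sketch, not taken from the diff: load the assumed
# "xlnet-base-cased" checkpoint and extract hidden states.
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("xlnet-base-cased")
model = AutoModel.from_pretrained("xlnet-base-cased")

# Encode a sentence and run a forward pass.
inputs = tokenizer("Hello, my dog is cute", return_tensors="pt")
outputs = model(**inputs)

# This is the line quoted in the second hunk header above.
last_hidden_states = outputs.last_hidden_state
print(last_hidden_states.shape)  # (batch_size, sequence_length, hidden_size)
```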