AlexWortega committed on
Commit
30ecc01
1 Parent(s): 5549d10

Update README.md

Files changed (1): README.md (+2 −5)
README.md CHANGED
@@ -18,9 +18,6 @@ ruGPT-Neo 1.3B is a transformer model designed using EleutherAI's replication of
 
 This model was trained on the wiki and Gazeta summarization datasets for 38k steps on 4×V100 GPUs (training is still in progress). It was trained as a masked autoregressive language model, using cross-entropy loss.
 
-## Intended Use and Limitations
-
-This way, the model learns an inner representation of the English language that can then be used to extract features useful for downstream tasks. The model is best at what it was pretrained for, however, which is generating texts from a prompt.
 
 ### How to use
 
@@ -29,7 +26,7 @@ You can use this model directly with a pipeline for text generation. This exampl
 ```py
 >>> from transformers import pipeline
 >>> generator = pipeline('text-generation', model='AlexWortega/rugpt-neo-1.3b')
->>> generator("EleutherAI has", do_sample=True, min_length=50)
-
-[{'generated_text': 'EleutherAI has made a commitment to create new software packages for each of its major clients and has'}]
+>>> generator("Как какать? Ответ:", do_sample=True, min_length=50)
+
+[{'generated_text': 'Как какать? Ответ: Спустите штаны и покакайте, затем воспользуйтесь бумагой'}]
 ```
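
For context on the training objective the README describes (autoregressive language modelling with cross-entropy loss): the model is trained to predict each next token, and the loss is the mean negative log-probability it assigns to the tokens actually observed. A toy sketch of that computation in pure Python — the probabilities below are made-up illustrative numbers, not outputs of this model:

```python
import math

# Made-up probabilities a model might assign to the *correct* next token
# at each position (a real model derives these from logits via softmax).
p_correct = [0.5, 0.25, 0.125]

# Cross-entropy loss for an autoregressive LM: mean negative log-probability
# of each observed next token across the sequence.
loss = -sum(math.log(p) for p in p_correct) / len(p_correct)
print(round(loss, 4))  # → 1.3863
```

Lower loss means the model concentrates more probability on the tokens that actually occur; a perfectly confident model would approach a loss of 0.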