Sharathhebbar24 committed on
Commit
d275446
1 Parent(s): 5f7bf58

Update README.md

Files changed (1): README.md (+5 -5)
README.md CHANGED
@@ -24,7 +24,7 @@ This is a fine-tuned version of https://huggingface.co/t5-small
 >>> summary = tokenizer.decode(outputs[0], skip_special_tokens=True)
 >>> return summary
 
->>> generate_text("Should I Invest in stocks")
+>>> generate_text("Android 3 Stars!!")
 
 It's good
 ```
@@ -52,7 +52,7 @@ It's good
 
 The developers of the Text-To-Text Transfer Transformer (T5) [write](https://ai.googleblog.com/2020/02/exploring-transfer-learning-with-t5.html):
 
-> With T5, we propose reframing all NLP tasks into a unified text-to-text-format where the input and output are always text strings, in contrast to BERT-style models that can only output either a class label or a span of the input. Our text-to-text framework allows us to use the same model, loss function, and hyperparameters on any NLP task.
+> With T5, we propose reframing all NLP tasks into a unified text-to-text format where the input and output are always text strings, in contrast to BERT-style models that can only output either a class label or a span of the input. Our text-to-text framework allows us to use the same model, loss function, and hyperparameters on any NLP task.
 
 T5-Small is the checkpoint with 60 million parameters.
 
@@ -79,15 +79,15 @@ See the [blog post](https://ai.googleblog.com/2020/02/exploring-transfer-learnin
 
 ## Out-of-Scope Use
 
-More information needed.
+More information is needed.
 
 # Bias, Risks, and Limitations
 
-More information needed.
+More information is needed.
 
 ## Recommendations
 
-More information needed.
+More information is needed.
 
 # Training Details
 
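The quoted passage describes T5's text-to-text framing: every task, including the classification-style prompts the first hunk edits, is expressed as plain input and output strings. As a minimal illustrative sketch (the `to_text_to_text` helper and its prefix table are hypothetical and not part of this model card; the prefixes follow the conventions published with the original T5 work):

```python
# Sketch of the text-to-text framing from the quoted T5 blog post: every task
# becomes a string-to-string mapping, so one model, one loss function, and one
# set of hyperparameters can serve classification, summarization, translation, etc.

def to_text_to_text(task: str, text: str) -> str:
    """Prepend a T5-style task prefix so any task is plain input text -> output text."""
    prefixes = {
        "summarize": "summarize: ",
        "translate_en_de": "translate English to German: ",
        "cola": "cola sentence: ",  # grammatical-acceptability classification
    }
    return prefixes[task] + text

print(to_text_to_text("summarize", "Android 3 Stars!!"))
# -> summarize: Android 3 Stars!!
```

The prefixed string would then be tokenized and passed to the model, with the answer (a summary, a translation, or even a class label such as "acceptable") read back out of the decoded output text, exactly as the `tokenizer.decode(..., skip_special_tokens=True)` line in the hunk above does.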