oluwatosin adewumi committed on
Commit
bc3a45b
1 Parent(s): 60a5b1e

code fixed

Files changed (1)
  1. README.md +2 -2
README.md CHANGED
@@ -24,7 +24,7 @@ More information about the original pre-trained model can be found [here](https:
 * Classification examples:
 |Prediction | Input |
 |---------|------------|
-|0 | "selective kindness : in europe , some refugees are more equal than others" |
+|0 | selective kindness : in europe , some refugees are more equal than others |
 |1 | he said their efforts should not stop only at creating many graduates but also extended to students from poor families so that they could break away from the cycle of poverty |
 
 ### How to use
@@ -32,8 +32,8 @@ More information about the original pre-trained model can be found [here](https:
 ```python
 from transformers import T5ForConditionalGeneration, T5Tokenizer
 import torch
-tokenizer = T5Tokenizer.from_pretrained("tosin/pcl_22")
 model = T5ForConditionalGeneration.from_pretrained("tosin/pcl_22")
+tokenizer = T5Tokenizer.from_pretrained("t5-base")  # use the source tokenizer, because the fine-tuned T5 tokenizer breaks
 tokenizer.pad_token = tokenizer.eos_token
 input_ids = tokenizer("he said their efforts should not stop only at creating many graduates but also extended to students from poor families so that they could break away from the cycle of poverty", padding=True, truncation=True, return_tensors='pt').input_ids
 outputs = model.generate(input_ids)
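The patched snippet stops at `model.generate(outputs)`, which returns token ids rather than a class label; the reader still has to decode them. A minimal sketch of that final step, assuming (as the classification-examples table suggests) the model generates the label as the text "0" or "1", and using a hypothetical `parse_label` helper not present in the README:

```python
def parse_label(decoded: str) -> int:
    """Map a decoded generation (e.g. '1') to an integer class label.

    Returns -1 for any output that is not a recognised label.
    """
    decoded = decoded.strip()
    return int(decoded) if decoded in {"0", "1"} else -1


# With `model`, `tokenizer`, and `outputs` set up as in the README,
# the remaining decode step would be roughly:
#   decoded = tokenizer.decode(outputs[0], skip_special_tokens=True)
#   label = parse_label(decoded)
print(parse_label("1"))     # → 1
print(parse_label("spam"))  # → -1
```

The guard against unexpected strings matters because a text-to-text model can, in principle, generate tokens outside the expected label set.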