tuner007 committed on
Commit caf6216 • 1 Parent(s): 7af080b

Update README.md

Files changed (1): README.md (+20 −24)
README.md CHANGED
@@ -1,5 +1,15 @@
- # Pegasus for Paraphrasing
- Pegasus model fine-tuned for paraphrasing

  ## Model in Action 🚀
  ```
@@ -10,16 +20,18 @@ torch_device = 'cuda' if torch.cuda.is_available() else 'cpu'
  tokenizer = PegasusTokenizer.from_pretrained(model_name)
  model = PegasusForConditionalGeneration.from_pretrained(model_name).to(torch_device)

- def get_response(input_text,num_return_sequences):
-   batch = tokenizer.prepare_seq2seq_batch([input_text],truncation=True,padding='longest',max_length=60, return_tensors="pt").to(torch_device)
-   translated = model.generate(**batch,max_length=60,num_beams=10, num_return_sequences=num_return_sequences, temperature=1.5)
    tgt_text = tokenizer.batch_decode(translated, skip_special_tokens=True)
    return tgt_text
  ```
- #### Example 1:
  ```
  context = "The ultimate test of your knowledge is your capacity to convey it to another."
- get_response(context,10)
  # output:
  ['The test of your knowledge is your ability to convey it.',
  'The ability to convey your knowledge is the ultimate test of your knowledge.',
@@ -32,22 +44,6 @@ get_response(context,10)
  'The test of your knowledge is how well you can convey it.',
  'Your capacity to convey your knowledge is the ultimate test.']
  ```
- #### Example 2: Question paraphrasing (was not trained on quora dataset)
- ```
- context = "Which course should I take to get started in data science?"
- get_response(context,10)
- # output:
- ['Which data science course should I take?',
- 'Which data science course should I take first?',
- 'Should I take a data science course?',
- 'Which data science class should I take?',
- 'Which data science course should I attend?',
- 'I want to get started in data science.',
- 'Which data science course should I enroll in?',
- 'Which data science course is right for me?',
- 'Which data science course is best for me?',
- 'Which course should I take to get started?']
- ```

- > Created by Arpit Rajauria
  [![Twitter icon](https://cdn0.iconfinder.com/data/icons/shift-logotypes/32/Twitter-32.png)](https://twitter.com/arpit_rajauria)
+ ---
+ language: en
+ license: apache-2.0
+ tags:
+ - pegasus
+ - paraphrasing
+ - seq2seq
+ pipeline_tag: text2text-generation
+ ---
+
+ ## Model description
+ [PEGASUS](https://github.com/google-research/pegasus) fine-tuned for paraphrasing

  ## Model in Action 🚀
  ```
  tokenizer = PegasusTokenizer.from_pretrained(model_name)
  model = PegasusForConditionalGeneration.from_pretrained(model_name).to(torch_device)

+ def get_response(input_text,num_return_sequences,num_beams):
+   batch = tokenizer([input_text],truncation=True,padding='longest',max_length=60, return_tensors="pt").to(torch_device)
+   translated = model.generate(**batch,max_length=60,num_beams=num_beams, num_return_sequences=num_return_sequences, temperature=1.5)
    tgt_text = tokenizer.batch_decode(translated, skip_special_tokens=True)
    return tgt_text
  ```
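Note that the tokenizer call above truncates inputs to `max_length=60` tokens, so long passages are silently cut off. One workaround is to split a passage into sentences and paraphrase each one individually. A minimal sketch, where `split_sentences` is a hypothetical helper not part of the model card:

```python
import re

def split_sentences(text):
    # Naive splitter: break after ., !, or ? followed by whitespace.
    # Assumption: good enough for plain prose; not a full sentence tokenizer.
    return [s.strip() for s in re.split(r"(?<=[.!?])\s+", text.strip()) if s]

passage = ("The ultimate test of your knowledge is your capacity to convey it. "
           "Teaching a topic forces you to organise what you know.")
for sentence in split_sentences(passage):
    print(sentence)  # each sentence can then be passed to get_response on its own
```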
+ #### Example:
  ```
+ num_beams = 10
+ num_return_sequences = 10
  context = "The ultimate test of your knowledge is your capacity to convey it to another."
+ get_response(context,num_return_sequences,num_beams)
  # output:
  ['The test of your knowledge is your ability to convey it.',
  'The ability to convey your knowledge is the ultimate test of your knowledge.',
  'The test of your knowledge is how well you can convey it.',
  'Your capacity to convey your knowledge is the ultimate test.']
  ```
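As the output above shows, beam search with a high `num_return_sequences` often returns near-duplicate candidates. A small post-processing sketch (the `dedupe_paraphrases` helper is hypothetical, not part of the model card) that keeps the first occurrence of each normalised candidate:

```python
import string

def dedupe_paraphrases(candidates):
    # Lowercase and strip punctuation so trivially different candidates
    # collapse to the same key; keep first occurrences, preserving order.
    seen = set()
    unique = []
    for text in candidates:
        key = text.lower().translate(str.maketrans("", "", string.punctuation)).strip()
        if key not in seen:
            seen.add(key)
            unique.append(text)
    return unique

candidates = [
    "The test of your knowledge is your ability to convey it.",
    "The test of your knowledge is your ability to convey it",
    "Your capacity to convey your knowledge is the ultimate test.",
]
print(dedupe_paraphrases(candidates))
# → first and third candidates; the second differs only in punctuation
```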
+ > Created by [Arpit Rajauria](https://twitter.com/arpit_rajauria)
  [![Twitter icon](https://cdn0.iconfinder.com/data/icons/shift-logotypes/32/Twitter-32.png)](https://twitter.com/arpit_rajauria)