---
license: mit
---

## dialoGPT-homer-simpson

This model has been fine-tuned on the complete scripts of Homer Simpson's lines from the T.V. show The Simpsons. In single-turn conversation it gives some nice answers that read as if they come straight from Homer's brain inside the Simpsons universe.

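The card does not spell out how the scripts were turned into training examples, but DialoGPT-style fine-tuning typically concatenates a context utterance and its response, separated by the end-of-text token. A minimal sketch of that pairing step (the speaker labels, line format and `build_pairs` helper are illustrative assumptions, not the actual preprocessing used for this model):

```python
# Hypothetical sketch of DialoGPT-style data preparation: pair each script
# line that Homer replies to with his reply, as "context<EOS>response<EOS>".
EOS = "<|endoftext|>"  # the GPT-2/DialoGPT end-of-text token

def build_pairs(script_lines):
    """script_lines: list of (speaker, text) tuples in script order."""
    pairs = []
    for (_, prev_text), (cur_speaker, cur_text) in zip(script_lines, script_lines[1:]):
        if cur_speaker == "Homer Simpson":
            pairs.append(prev_text + EOS + cur_text + EOS)
    return pairs

script = [
    ("Marge Simpson", "Homer, dinner is ready."),
    ("Homer Simpson", "Mmm... dinner."),
    ("Bart Simpson", "Cowabunga!"),
]
print(build_pairs(script))
# -> ['Homer, dinner is ready.<|endoftext|>Mmm... dinner.<|endoftext|>']
```
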
## A State-of-the-Art Large-scale Pretrained Response generation model (DialoGPT)

DialoGPT is a SOTA large-scale pretrained dialogue response generation model for multi-turn conversations.
 
ArXiv paper: [https://arxiv.org/abs/1911.00536](https://arxiv.org/abs/1911.00536)

### How to use Multi-Turn

NOTE: Multi-turn generation seems to be broken; after a few exchanges the output is mostly exclamation marks.

Now we are ready to try out how the model works as a chatting partner!

```python
from transformers import AutoModelForCausalLM, AutoTokenizer
import torch

tokenizer = AutoTokenizer.from_pretrained("shalpin87/dialoGPT-homer-simpson")
model = AutoModelForCausalLM.from_pretrained("shalpin87/dialoGPT-homer-simpson")

# chat for 5 turns, feeding the growing history back in each time
for step in range(5):
    # encode the new user input, add the eos_token and return a tensor in PyTorch
    new_user_input_ids = tokenizer.encode(input(">> User:") + tokenizer.eos_token, return_tensors='pt')

    # append the new user input tokens to the chat history
    bot_input_ids = torch.cat([chat_history_ids, new_user_input_ids], dim=-1) if step > 0 else new_user_input_ids

    # generate a response while limiting the total chat history to 1000 tokens
    chat_history_ids = model.generate(bot_input_ids, max_length=1000, pad_token_id=tokenizer.eos_token_id)

    # pretty print last output tokens from bot
    print("DialoG-PT-HomerBot: {}".format(tokenizer.decode(chat_history_ids[:, bot_input_ids.shape[-1]:][0], skip_special_tokens=True)))
```
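One plausible contributor to the multi-turn degeneration noted above is that the chat history grows without bound, drifting away from the short exchanges seen during fine-tuning. A common mitigation is to cap the context to the most recent tokens before generating. A stdlib sketch of the idea, using plain lists in place of tensors (the 128-token cap is an arbitrary illustrative choice):

```python
def trim_history(history_ids, new_ids, max_context=128):
    """Append the new user turn to the running history and keep only
    the most recent max_context token ids as model input."""
    combined = history_ids + new_ids
    return combined[-max_context:]

# toy integer ids standing in for tokenizer output
history = list(range(200))
new_turn = [900, 901, 902]
trimmed = trim_history(history, new_turn)
print(len(trimmed), trimmed[0], trimmed[-3:])
# -> 128 75 [900, 901, 902]
```

In the real loop the same slice would be applied to the result of `torch.cat` before calling `model.generate`.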

### How to use Single Turn

```python
from transformers import AutoModelForCausalLM, AutoTokenizer
import torch

tokenizer = AutoTokenizer.from_pretrained("shalpin87/dialoGPT-homer-simpson")
model = AutoModelForCausalLM.from_pretrained("shalpin87/dialoGPT-homer-simpson")

questions = [
    "What is your name?",
    "Who are you?",
    "Where do you work?",
    "Who really killed Mr Burns?",
    "Have you ever stolen from the Kwik-E-Mart?",
    "Did you kill Frank Grimes?",
    "Who was the worst member of the Be Sharps?",
    "Hey where did Barney go?",
    "What is your favorite bar to have a beer?",
    "What is the best beer in Springfield?",
    "Is Bart working for the Mob?",
    "I think there was an incident in sector 7 G",
    "Is Ned Flanders house okay?",
    "Oh my god it's Sideshow Bob",
    "What is a Flaming Moe?",
    "What is happening to Apu?",
    "Who quit the band?"
]

# Let's chat
for step in range(len(questions)):
    print("***************************************")
    print("Q. {}".format(questions[step]))
    new_user_input_ids = tokenizer.encode(questions[step] + tokenizer.eos_token, return_tensors='pt')
    # new_user_input_ids = tokenizer.encode(input(">> User:") + tokenizer.eos_token, return_tensors='pt')

    # single turn: no chat history is carried over
    bot_input_ids = new_user_input_ids

    num_return_seqs = 1

    chat_history_ids = model.generate(
        bot_input_ids,
        max_length=200,
        pad_token_id=tokenizer.eos_token_id,
        no_repeat_ngram_size=3,
        do_sample=True,
        top_k=200,
        top_p=0.55,
        temperature=0.85,
        num_return_sequences=num_return_seqs
    )

    botname = "HomerBot"
    for i in range(0, num_return_seqs):
        print("{}:{}: {}".format(i, botname, tokenizer.decode(chat_history_ids[:, bot_input_ids.shape[-1]:][i], skip_special_tokens=True)))

    # reset the history so each question is answered in isolation
    chat_history_ids = []
```

Sample Output
```
***************************************
Q. What is your name?
0:HomerBot: Homer Simpson.
***************************************
Q. Who are you?
0:HomerBot: Homer Simpson.
***************************************
Q. Where do you work?
0:HomerBot: I work at a nuclear power plant.
***************************************
Q. Who really killed Mr Burns?
0:HomerBot: I dunno. Some guy named Ot-to.
***************************************
Q. Have you ever stolen from the Kwik-E-Mart?
0:HomerBot: No.
***************************************
Q. Did you kill Frank Grimes?
0:HomerBot: No.
***************************************
Q. Who was the worst member of the Be Sharps?
0:HomerBot: Homer Simpson, sir.
***************************************
Q. Hey where did Barney go?
0:HomerBot: He's been eating that thing for a week.
***************************************
Q. What is your favorite bar to have a beer?
0:HomerBot: The Flanders' Club.
***************************************
Q. What is the best beer in Springfield?
0:HomerBot: Pepi!
***************************************
Q. Is Bart working for the Mob?
0:HomerBot: I think he's trying to tell us something.
***************************************
Q. I think there was an incident in sector 7 G
0:HomerBot: Uh, let's not jump to any conclusions. The dog didn't run away.
```
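The single-turn example samples with `top_k=200`, `top_p=0.55` and `temperature=0.85`. As a rough illustration of what the `top_p` (nucleus) filter does, here is a plain-Python sketch over a toy next-token distribution; it is not the actual `transformers` implementation:

```python
def top_p_filter(probs, p=0.55):
    """Keep the smallest set of highest-probability tokens whose cumulative
    probability reaches p, then renormalize; sampling is restricted to them."""
    ranked = sorted(probs.items(), key=lambda kv: kv[1], reverse=True)
    kept, total = [], 0.0
    for tok, pr in ranked:
        kept.append((tok, pr))
        total += pr
        if total >= p:
            break
    norm = sum(pr for _, pr in kept)
    return {tok: pr / norm for tok, pr in kept}

# toy distribution: with p=0.55 only "beer" and "donuts" survive the filter
probs = {"beer": 0.5, "donuts": 0.3, "work": 0.15, "salad": 0.05}
print(top_p_filter(probs, p=0.55))
```

Lowering `top_p` makes replies more predictable; raising it (or `temperature`) makes them more varied but less coherent.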