jesseD committed on
Commit
bbd1433
1 Parent(s): 865370a

Update README.md

Files changed (1)
  1. README.md +42 -1
README.md CHANGED
@@ -5,4 +5,45 @@ language:
  - en
  ---
 
- This literally is the worst model ever :(...
+ # HomerBot: A conversational chatbot imitating Homer Simpson
+ 
+ This model is [DialoGPT](https://huggingface.co/microsoft/DialoGPT-medium) (medium version) fine-tuned on Simpsons [scripts](https://www.kaggle.com/datasets/pierremegret/dialogue-lines-of-the-simpsons).
+ 
+ More specifically, we fine-tune DialoGPT-medium for 3 epochs on 10K **(character utterance, Homer's response)** pairs.
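+ 
+ To give a rough idea of what these pairs look like, here is a minimal data-preparation sketch (an illustration with a made-up pair, not the actual training script; see the repo below for the real code): each **(character utterance, Homer's response)** pair is flattened into a single string with the turns separated by the EOS token, which is the dialogue format DialoGPT expects.
+ 
+ ```python
+ from transformers import AutoTokenizer
+ 
+ tokenizer = AutoTokenizer.from_pretrained("microsoft/DialoGPT-medium")
+ tokenizer.pad_token = tokenizer.eos_token  # GPT-2 tokenizers define no pad token by default
+ 
+ # illustrative (character utterance, Homer's response) pair -- not taken from the dataset
+ pairs = [
+     ("Hi Homer, how was work?", "D'oh! Don't even ask."),
+ ]
+ 
+ # DialoGPT is trained on dialogue turns concatenated into one string,
+ # with each turn terminated by the EOS token
+ train_texts = [
+     utterance + tokenizer.eos_token + response + tokenizer.eos_token
+     for utterance, response in pairs
+ ]
+ 
+ # tokenize the flattened dialogues for causal-LM fine-tuning (e.g. with the transformers Trainer)
+ encodings = tokenizer(train_texts, padding=True, return_tensors="pt")
+ ```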
+ 
+ For more details, check out our git [repo](https://github.com/jesseDingley/HomerBot) containing all the code.
+ 
+ ### How to use
+ 
+ ```python
+ from transformers import AutoModelForCausalLM, AutoTokenizer
+ import torch
+ 
+ tokenizer = AutoTokenizer.from_pretrained("DingleyMaillotUrgell/homer-bot")
+ model = AutoModelForCausalLM.from_pretrained("DingleyMaillotUrgell/homer-bot")
+ 
+ # Let's chat for 5 lines
+ for step in range(5):
+     # encode the new user input, add the eos_token and return a PyTorch tensor
+     new_user_input_ids = tokenizer.encode(input(">> User: ") + tokenizer.eos_token, return_tensors='pt')
+ 
+     # append the new user input tokens to the chat history
+     bot_input_ids = torch.cat([chat_history_ids, new_user_input_ids], dim=-1) if step > 0 else new_user_input_ids
+ 
+     # generate a response while limiting the total chat history to 1000 tokens
+     chat_history_ids = model.generate(
+         bot_input_ids,
+         max_length=1000,
+         pad_token_id=tokenizer.eos_token_id,
+         no_repeat_ngram_size=3,
+         do_sample=True,
+         top_k=100,
+         top_p=0.7,
+         temperature=0.8
+     )
+ 
+     # print the bot's last output tokens (everything generated after the input)
+     print("Homer: {}".format(tokenizer.decode(chat_history_ids[:, bot_input_ids.shape[-1]:][0], skip_special_tokens=True)))
+ ```