---

## DialoGPT_MWOZ_Idioms

This is a fine-tuned version of the DialoGPT (medium)-MultiWOZ model, further trained on the PIE-English idioms corpus. It is intended to be used as an idiom-aware conversational system.
The dataset it is trained on is limited in scope, as it covers only 10 classes of idioms: metaphor, simile, euphemism, parallelism, personification, oxymoron, paradox, hyperbole, irony, and literal.
The perplexities achieved on the test and validation sets are 160.9 and 160.46, respectively. More information about the original pre-trained model can be found [here](https://huggingface.co/tosin/dialogpt_mwoz).
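For context, perplexity is the exponential of the mean token-level cross-entropy loss. A minimal sketch of how such a score can be computed with Transformers (the helper below is illustrative, not part of this card; `model` and `tokenizer` are assumed loaded as in the How to use section):

```python
import torch

def sentence_perplexity(model, tokenizer, text):
    """Perplexity of `text` under the model: exp(mean cross-entropy per token)."""
    input_ids = tokenizer.encode(text, return_tensors="pt")
    with torch.no_grad():
        # Passing input_ids as labels makes the causal LM return the mean
        # (shifted) cross-entropy loss over the sequence.
        loss = model(input_ids, labels=input_ids).loss
    return torch.exp(loss).item()
```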
* Generation example from an interactive environment:

| Role | Response |
|------|----------|
| User | remind her to take it off the hook. |
| Bot  | they are under a cloud because of fears about US President's wife Hillary Clinton |
| User | Some families breed like rabbits and send us all their male heirs |
| Bot  | he was out for the count |
Please find information about preprocessing, training, and full details of DialoGPT in the [original DialoGPT repository](https://github.com/microsoft/DialoGPT).
### How to use

Now we are ready to try out how the model works as a chatting partner!

```python
from transformers import AutoModelForCausalLM, AutoTokenizer
import torch
```
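Building on the imports above, a minimal interactive loop in the standard DialoGPT style (the repository ID `tosin/dialogpt_mwoz_idioms` is assumed from this card's title, and the generation settings are illustrative):

```python
# Repository ID assumed from this card's title; adjust if the hosted repo differs.
tokenizer = AutoTokenizer.from_pretrained("tosin/dialogpt_mwoz_idioms")
model = AutoModelForCausalLM.from_pretrained("tosin/dialogpt_mwoz_idioms")

# Chat for 5 turns, accumulating the conversation history as DialoGPT expects.
chat_history_ids = None
for step in range(5):
    # Encode the new user input, appending the end-of-string token.
    new_input_ids = tokenizer.encode(input(">> User: ") + tokenizer.eos_token, return_tensors="pt")

    # Append the new input to the running conversation history.
    bot_input_ids = (
        torch.cat([chat_history_ids, new_input_ids], dim=-1)
        if chat_history_ids is not None
        else new_input_ids
    )

    # Generate a response, capping the total history at 1000 tokens.
    chat_history_ids = model.generate(bot_input_ids, max_length=1000, pad_token_id=tokenizer.eos_token_id)

    # Print only the newly generated tokens.
    print("DialoGPT_MWOZ_Idioms: {}".format(
        tokenizer.decode(chat_history_ids[0][bot_input_ids.shape[-1]:], skip_special_tokens=True)
    ))
```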