Kwaku committed
Commit 8820883
1 Parent(s): ea378bc

Update README.md

Files changed (1)
  1. README.md +1 -1
README.md CHANGED
@@ -11,7 +11,7 @@ This is a fine-tuned version of the GPT2 model. It's best suited for text-genera
 gpt2-finetuned-ko was fine tuned on the [banking77](https://huggingface.co/datasets/banking77) dataset, which is "composed of online banking queries annotated with their corresponding intents."
 
 ## Intended Uses and Limitations
-Given the hugeness of the [Microsoft DialoGPT-large](https://huggingface.co/microsoft/DialoGPT-large) model, the author resorted to fine-tuning the gpt2 model for the creation of a chatbot. The intent was for the chatbot to emulate a banking customer agent, hence the use of the banking77 dataset. However, when the fine-tuned model was deployed in the chatbot, the results were undesirable. Its responses were inappropriate, unnecessarily long and repetitive. The model performs better in text-generation but is prone to generate baking-related texted because of the corpus it was trained on.
+Given the magnitude of the [Microsoft DialoGPT-large](https://huggingface.co/microsoft/DialoGPT-large) model, the author resorted to fine-tuning the gpt2 model for the creation of a chatbot. The intent was for the chatbot to emulate a banking customer agent, hence the use of the banking77 dataset. However, when the fine-tuned model was deployed in the chatbot, the results were undesirable. Its responses were inappropriate and unnecessarily long, and the last word of each response was often repeated many times, a major glitch. The model performs better in text-generation, but it is prone to generating banking-related text because of the corpus it was trained on.
 
 ### How to use
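The hunk ends at the "### How to use" heading, so the usage snippet itself is not part of this diff. For reference, a minimal sketch of loading a fine-tuned GPT-2 checkpoint with the transformers text-generation pipeline might look like the following; the repository ID `Kwaku/gpt2-finetuned-banking77` is an assumption and should be replaced with the model's actual path on the Hub.

```python
# Minimal sketch, not the README's own snippet: the repo ID below is an assumption.
from transformers import pipeline

# Load the fine-tuned GPT-2 checkpoint (assumed ID; replace with the actual model path).
generator = pipeline("text-generation", model="Kwaku/gpt2-finetuned-banking77")

# A banking-flavored prompt, since the model was fine-tuned on banking77 queries.
prompt = "I tried to withdraw cash and my card was declined."
result = generator(prompt, max_new_tokens=40, num_return_sequences=1)
print(result[0]["generated_text"])
```

Because of the repetition issue described above, generation parameters such as `no_repeat_ngram_size` or `repetition_penalty` may be worth experimenting with.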