Deeokay committed on
Commit 93213d2
1 Parent(s): 4d26d05

Update README.md

Files changed (1)
  1. README.md +2 -3
README.md CHANGED
@@ -86,8 +86,7 @@ Please feel free to customize the Modelfile, and if you do get a better response
 If you would like to know how I started creating my dataset, you can check this link
 [Crafting GPT2 for Personalized AI-Preparing Data the Long Way (Part1)](https://medium.com/@deeokay/the-soul-in-the-machine-crafting-gpt2-for-personalized-ai-9d38be3f635f)

-As the data was created with custom GPT2 special tokens, I had to convert it to an Alpaca Template.
-However, I got creative again: the training data has the following Template:
+## The training data has the following Template:

 ```python
 special_tokens_dict = {
@@ -110,7 +109,7 @@ tokenizer.pad_token_id = tokenizer.convert_tokens_to_ids('<|PAD|>')

 ```

-The data is in the following format:
+## The data is in the following format:

 ```python
 def combine_text(user_prompt, analysis, sentiment, new_response, classification):
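
The first hunk cuts off the README's `special_tokens_dict` block, so only its opening line is visible here. For orientation, below is a minimal sketch of the usual pattern for registering custom special tokens on a GPT2 tokenizer with Hugging Face `transformers`; apart from `<|PAD|>` (which appears in the second hunk header), the token names are hypothetical placeholders, not the README's actual list.

```python
from transformers import AutoTokenizer

# Minimal sketch, assuming the Hugging Face `transformers` tokenizer API.
tokenizer = AutoTokenizer.from_pretrained("gpt2")

# Hypothetical token names for illustration -- the README's full
# `special_tokens_dict` is truncated by the diff above.
special_tokens_dict = {
    "pad_token": "<|PAD|>",
    "additional_special_tokens": ["<|BEGIN_QUERY|>", "<|END_QUERY|>"],
}

# Register the tokens with the tokenizer; returns how many were added.
num_added = tokenizer.add_special_tokens(special_tokens_dict)

# This line appears verbatim in the README (see the second hunk header).
tokenizer.pad_token_id = tokenizer.convert_tokens_to_ids('<|PAD|>')

print(f"added {num_added} tokens, pad_token_id={tokenizer.pad_token_id}")
```

If the tokens were added this way, the model's embedding matrix would typically also need to grow to match the new vocabulary, e.g. via `model.resize_token_embeddings(len(tokenizer))`, before fine-tuning.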
 
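
The second hunk likewise shows only the signature of `combine_text`; its body is cut off by the diff. A minimal sketch of what such a helper could look like, assuming each field is simply wrapped in the custom special tokens (the tag names here are hypothetical placeholders, not the README's actual implementation):

```python
# Sketch only: wrap each dataset field in begin/end special tokens so the
# combined string can be tokenized as a single training example.
def combine_text(user_prompt, analysis, sentiment, new_response, classification):
    return (
        f"<|BEGIN_QUERY|>{user_prompt}<|END_QUERY|>"
        f"<|BEGIN_ANALYSIS|>{analysis}<|END_ANALYSIS|>"
        f"<|BEGIN_RESPONSE|>{new_response}<|END_RESPONSE|>"
        f"<|BEGIN_SENTIMENT|>{sentiment}<|END_SENTIMENT|>"
        f"<|BEGIN_CLASSIFICATION|>{classification}<|END_CLASSIFICATION|>"
    )

# Example usage with made-up field values:
sample = combine_text(
    user_prompt="What does a tokenizer do?",
    analysis="The user is asking for a basic NLP definition.",
    sentiment="Neutral",
    new_response="A tokenizer splits raw text into tokens the model can embed.",
    classification="Question",
)
print(sample)
```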