gugarosa committed on
Commit fc313b0
1 Parent(s): 810c2e6

fix(root): Replaces system role with user role to improve the generation experience.

Files changed (2):
  1. README.md +0 -5
  2. tokenizer_config.json +1 -1
README.md CHANGED
@@ -70,8 +70,6 @@ You can provide the prompt as a question with a generic template as follow:
 ```
 For example:
 ```markdown
-<|system|>
-You are a helpful AI assistant.<|end|>
 <|user|>
 How to explain Internet for a medieval knight?<|end|>
 <|assistant|>
@@ -80,8 +78,6 @@ How to explain Internet for a medieval knight?<|end|>
 where the model generates the text after `<|assistant|>`. In case of few-shots prompt, the prompt can be formatted as the following:
 
 ```markdown
-<|system|>
-You are a helpful AI assistant.<|end|>
 <|user|>
 I am going to Paris, what should I see?<|end|>
 <|assistant|>
@@ -110,7 +106,6 @@ model = AutoModelForCausalLM.from_pretrained(
 tokenizer = AutoTokenizer.from_pretrained("microsoft/Phi-3-mini-128k-instruct")
 
 messages = [
-    {"role": "system", "content": "You are a helpful digital assistant. Please provide safe, ethical and accurate information to the user."},
     {"role": "user", "content": "Can you provide ways to eat combinations of bananas and dragonfruits?"},
     {"role": "assistant", "content": "Sure! Here are some ways to eat bananas and dragonfruits together: 1. Banana and dragonfruit smoothie: Blend bananas and dragonfruits together with some milk and honey. 2. Banana and dragonfruit salad: Mix sliced bananas and dragonfruits together with some lemon juice and honey."},
     {"role": "user", "content": "What about solving an 2x + 3 = 7 equation?"},
 
tokenizer_config.json CHANGED
@@ -116,7 +116,7 @@
   }
 },
 "bos_token": "<s>",
-"chat_template": "{{ bos_token }}{% for message in messages %}{% if (message['role'] == 'system') %}{{'<|system|>' + '\n' + message['content'] + '<|end|>' + '\n'}}{% elif (message['role'] == 'user') %}{{'<|user|>' + '\n' + message['content'] + '<|end|>' + '\n' + '<|assistant|>' + '\n'}}{% elif message['role'] == 'assistant' %}{{message['content'] + '<|end|>' + '\n'}}{% endif %}{% endfor %}",
+"chat_template": "{{ bos_token }}{% for message in messages %}{% if (message['role'] == 'user') %}{{'<|user|>' + '\n' + message['content'] + '<|end|>' + '\n' + '<|assistant|>' + '\n'}}{% elif (message['role'] == 'assistant') %}{{message['content'] + '<|end|>' + '\n'}}{% endif %}{% endfor %}",
 "clean_up_tokenization_spaces": false,
 "eos_token": "<|endoftext|>",
 "legacy": false,