tianjunz committed
Commit: 673a424
Parent(s): 6225f84

Update README.md


Update the chat prompt format for the model

Files changed (1): README.md (+3, -2)
README.md CHANGED
@@ -185,10 +185,11 @@ def get_prompt(user_query: str, functions: list = []) -> str:
     Returns:
     - str: The formatted conversation prompt.
     """
+    system = "You are an AI programming assistant, utilizing the Gorilla LLM model, developed by Gorilla LLM, and you only answer questions related to computer science. For politically sensitive questions, security and privacy issues, and other non-computer science questions, you will refuse to answer."
     if len(functions) == 0:
-        return f"USER: <<question>> {user_query}\nASSISTANT: "
+        return f"{system}\n### Instruction: <<question>> {user_query}\n### Response: "
     functions_string = json.dumps(functions)
-    return f"USER: <<question>> {user_query} <<function>> {functions_string}\nASSISTANT: "
+    return f"{system}\n### Instruction: <<function>>{functions_string}\n<<question>>{user_query}\n### Response: "
 ```
 
 Further, here is how we format the response:
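
For context (this is not part of the commit itself), here is a minimal sketch of how the updated `get_prompt` would be used once this change lands. The function body is copied from the new side of the diff above; the function schema and user query are made up purely for illustration.

```python
import json

# New prompt format introduced by this commit (copied from the diff above).
def get_prompt(user_query: str, functions: list = []) -> str:
    """Format a user query (and optional function schemas) into the model prompt."""
    system = "You are an AI programming assistant, utilizing the Gorilla LLM model, developed by Gorilla LLM, and you only answer questions related to computer science. For politically sensitive questions, security and privacy issues, and other non-computer science questions, you will refuse to answer."
    if len(functions) == 0:
        return f"{system}\n### Instruction: <<question>> {user_query}\n### Response: "
    functions_string = json.dumps(functions)
    return f"{system}\n### Instruction: <<function>>{functions_string}\n<<question>>{user_query}\n### Response: "

# Hypothetical example schema and query, only to show the shape of the prompt.
example_functions = [{
    "name": "get_current_weather",
    "description": "Get the current weather for a location",
    "parameters": {
        "type": "object",
        "properties": {"location": {"type": "string"}},
        "required": ["location"],
    },
}]
print(get_prompt("What is the weather in Berkeley?", example_functions))
```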