qnguyen3 committed
Commit: 70ae998
Parent: 6625d89

Update README.md

Files changed (1)
  1. README.md +2 -2
README.md CHANGED
@@ -43,8 +43,8 @@ This project is led by qnguyen3 and teknium.
 
 ## Usage
 ### Prompt Format
-- Like other LLaVA's variants, this model uses Vicuna-V1 as its prompt template. Please refer to `conv_llava_v1` in (this file)[https://github.com/qnguyen3/hermes-llava/blob/main/llava/conversation.py]
-- For Gradio UI, please visit this (GitHub Repo)[https://github.com/qnguyen3/hermes-llava]
+- Like other LLaVA's variants, this model uses Vicuna-V1 as its prompt template. Please refer to `conv_llava_v1` in [this file](https://github.com/qnguyen3/hermes-llava/blob/main/llava/conversation.py)
+- For Gradio UI, please visit this [GitHub Repo](https://github.com/qnguyen3/hermes-llava)
 
 ### Function Calling
 - For functiong calling, the message should start with a `<fn_call>` tag. Here is an example:
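For reference, the README lines above point to `conv_llava_v1` in the repository's `llava/conversation.py` for the exact prompt template. The sketch below only illustrates the general Vicuna-V1 layout (system message, then alternating `USER`/`ASSISTANT` turns); the system text, separators, the `<image>` placeholder, and the helper name `build_vicuna_v1_prompt` are assumptions for illustration, not values copied from this repository.

```python
# Rough sketch of a Vicuna-V1 style prompt, as typically used by LLaVA variants.
# The authoritative template is `conv_llava_v1` in llava/conversation.py of the
# hermes-llava repo; the system message and separators here are assumptions.

SYSTEM = (
    "A chat between a curious human and an artificial intelligence assistant. "
    "The assistant gives helpful, detailed, and polite answers to the human's questions."
)

def build_vicuna_v1_prompt(user_message: str) -> str:
    # Vicuna-V1 formatting: system prompt, then "USER: ..." and a trailing
    # "ASSISTANT:" that the model completes.
    return f"{SYSTEM} USER: {user_message} ASSISTANT:"

# LLaVA-style models usually expect an <image> placeholder in the user turn
# where the image features are injected (assumed here for illustration).
print(build_vicuna_v1_prompt("<image>\nWhat is shown in this image?"))
```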