fix broken links to llama-cpp-python

#2
Files changed (1)
  1. README.md +2 -2
README.md CHANGED
@@ -146,7 +146,7 @@ For other parameters and how to use them, please refer to [the llama.cpp documen
 
 ## How to run in `text-generation-webui`
 
-Further instructions can be found in the text-generation-webui documentation, here: [text-generation-webui/docs/04 ‐ Model Tab.md](https://github.com/oobabooga/text-generation-webui/blob/main/docs/04%20%E2%80%90%20Model%20Tab.md#llamacpp).
+Further instructions can be found in the text-generation-webui documentation, here: [text-generation-webui/docs/04 ‐ Model Tab.md](https://github.com/oobabooga/text-generation-webui/blob/main/docs/04%20-%20Model%20Tab.md#llamacpp).
 
 ## How to run from Python code
 
@@ -154,7 +154,7 @@ You can use GGUF models from Python using the [llama-cpp-python](https://github.
 
 ### How to load this model in Python code, using llama-cpp-python
 
-For full documentation, please see: [llama-cpp-python docs](https://abetlen.github.io/llama-cpp-python/).
+For full documentation, please see: [llama-cpp-python docs](https://github.com/abetlen/llama-cpp-python).
 
 #### First install the package
 
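For context, the README section whose links this PR repairs covers installing llama-cpp-python and loading a GGUF model from Python. A minimal sketch of that usage is shown below; the model filename, context size, and prompt are placeholders for illustration and are not part of this PR or the README being patched.

```python
# Install first (CPU-only build):
#   pip install llama-cpp-python

from llama_cpp import Llama

# Load a local GGUF model; the path here is a hypothetical placeholder.
llm = Llama(
    model_path="./model.gguf",  # path to a downloaded GGUF file
    n_ctx=2048,                 # context window size
    n_gpu_layers=0,             # 0 = CPU only; raise if built with GPU support
)

# Run a simple completion and print the generated text.
output = llm(
    "Q: Name the planets in the solar system? A:",
    max_tokens=64,
    stop=["Q:"],
)
print(output["choices"][0]["text"])
```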