JosephusCheung committed on
Commit 66077eb
1 Parent(s): 1c5e97a

Update README.md

Files changed (1): README.md (+29 −2)
README.md CHANGED
@@ -9,8 +9,6 @@ tags:
  ---
  **Please read me! To use the GGUF from this repo, please use the latest llama.cpp with PR [#4283](https://github.com/ggerganov/llama.cpp/pull/4283) merged.**

- New version llama.cpp wheel for text-generation-webui is building, to be updated soon
-
  # Uncensored, white-labeled... Compatible with Meta LLaMA 2.

  This is **not in Qwen Format**, but in **LLaMA format**.
@@ -31,4 +29,33 @@ cat 72b-q5_k_m.gguf-split-a 72b-q5_k_m.gguf-split-b > 72b-q5_k_m.gguf
  windows
  ```cmd
  copy /b 72b-q5_k_m.gguf-split-a + 72b-q5_k_m.gguf-split-b 72b-q5_k_m.gguf
 
+ ```
+
+ ## How to update your text-generation-webui
+
+ Before the official update lands, you can install the latest wheel manually.
+
+ 1. Check your current version first, for example:
+ ```bash
+ pip show llama_cpp_python_cuda
+ ```
+
+ ```
+ Name: llama_cpp_python_cuda
+ Version: 0.2.19+cu121
+ Summary: Python bindings for the llama.cpp library
+ Home-page:
+ Author:
+ Author-email: Andrei Betlen <abetlen@gmail.com>
+ License: MIT
+ Location: /usr/local/lib/python3.9/dist-packages
+ Requires: diskcache, numpy, typing-extensions
+ ```
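The version check above can also be scripted instead of parsing `pip show` output by eye. A minimal sketch using Python's standard `importlib.metadata`; the package name `llama_cpp_python_cuda` is the one reported above, and the `installed_version` helper is illustrative, not part of this commit:

```python
# Query the installed wheel's version without shelling out to pip.
from importlib.metadata import version, PackageNotFoundError

def installed_version(package):
    """Return the installed version string, or None if the package is absent."""
    try:
        return version(package)
    except PackageNotFoundError:
        return None

# Package name as reported by `pip show` above; adjust if your install differs.
print(installed_version("llama_cpp_python_cuda"))
```

If this prints `None`, the CUDA wheel is not installed under that name and step 2 below applies.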
+
+ 2. Then install the matching wheel from here: https://github.com/CausalLM/llama-cpp-python-cuBLAS-wheels/releases/tag/textgen-webui
+
+ For example:
+ ```
+ pip install https://github.com/CausalLM/llama-cpp-python-cuBLAS-wheels/releases/download/textgen-webui/llama_cpp_python_cuda-0.2.21+cu121basic-cp39-cp39-manylinux_2_31_x86_64.whl
  ```
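Separately, the `cat`/`copy /b` commands in the diff above join the split GGUF parts on Linux and Windows respectively. For a cross-platform alternative, here is a hedged Python sketch; the filenames are the ones from the README, and the `join_parts` helper is hypothetical, not part of the commit:

```python
# Stream-concatenate split GGUF parts into one file, equivalent to
# `cat a b > out` (Linux) or `copy /b a + b out` (Windows), without
# loading the multi-gigabyte parts into memory at once.
import os
import shutil

def join_parts(parts, output):
    """Concatenate the files in `parts`, in order, into `output`."""
    with open(output, "wb") as out:
        for part in parts:
            with open(part, "rb") as src:
                shutil.copyfileobj(src, out)  # copies in buffered chunks

parts = ["72b-q5_k_m.gguf-split-a", "72b-q5_k_m.gguf-split-b"]
if all(os.path.exists(p) for p in parts):  # only run where the splits exist
    join_parts(parts, "72b-q5_k_m.gguf")
```

Order matters: pass the `-split-a` part before `-split-b`, exactly as in the shell commands.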