TheBloke committed on
Commit 0829129
1 Parent(s): 5d91b2e

Update README.md

Files changed (1)
  1. README.md +4 -7
README.md CHANGED
@@ -45,16 +45,13 @@ Now that we have ExLlama, that is the recommended loader to use for these models
 
 Reminder: ExLlama does not support 3-bit models, so if you wish to try those quants, you will need to use AutoGPTQ or GPTQ-for-LLaMa.
 
-## AutoGPTQ and GPTQ-for-LLaMa requires latest version of Transformers
+## AutoGPTQ and GPTQ-for-LLaMa compatibility
 
-If you plan to use any of these quants with AutoGPTQ or GPTQ-for-LLaMa, your Transformers needs to be be using the latest Github code.
-
-If you're using text-generation-webui and have updated to the latest version, this is done for you automatically.
-
-If not, you can update it manually with:
+Please update AutoGPTQ to version 0.3.1 or later. This will also update Transformers to 4.31.0, which is required for Llama 70B compatibility.
 
+If you're using GPTQ-for-LLaMa, please update Transformers manually with:
 ```
-pip3 install git+https://github.com/huggingface/transformers
+pip3 install "transformers>=4.31.0"
 ```
 
 ## Repositories available
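
The new section pins Transformers at 4.31.0 or later. As a minimal sketch (not part of the README itself), here is one way to check locally whether an installed package already satisfies such a requirement, using only the standard library; the helper names `version_tuple` and `meets_minimum` are illustrative, not from the diff:

```python
# Sketch: check whether an installed package meets a minimum version,
# e.g. the transformers>=4.31.0 requirement from the updated README.
from importlib.metadata import PackageNotFoundError, version


def version_tuple(v: str) -> tuple:
    """Turn a version string like '4.31.0' into (4, 31, 0) for numeric comparison."""
    return tuple(int(part) for part in v.split(".")[:3] if part.isdigit())


def meets_minimum(package: str, minimum: str) -> bool:
    """Return True if `package` is installed at `minimum` or newer."""
    try:
        return version_tuple(version(package)) >= version_tuple(minimum)
    except PackageNotFoundError:
        return False
```

For example, `meets_minimum("transformers", "4.31.0")` returning False would indicate the upgrade step in the diff still needs to be run. Note this simple numeric comparison ignores pre-release suffixes; a real tool would use `packaging.version` instead.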