TheBloke committed on
Commit
17b2e2a
1 Parent(s): 250eba0

Upload README.md

Files changed (1): README.md (+5 −5)
README.md CHANGED
````diff
@@ -102,7 +102,7 @@ All recent GPTQ files are made with AutoGPTQ, and all files in non-main branches
 
 To download from the `main` branch, enter `TheBloke/Xwin-LM-70B-V0.1-GPTQ` in the "Download model" box.
 
-To download from another branch, add `:branchname` to the end of the download name, eg `TheBloke/Xwin-LM-70B-V0.1-GPTQ:main`
+To download from another branch, add `:branchname` to the end of the download name, eg `TheBloke/Xwin-LM-70B-V0.1-GPTQ:gptq-4bit-128g-actorder_True`
 
 ### From the command line
 
@@ -123,7 +123,7 @@ To download from a different branch, add the `--revision` parameter:
 
 ```shell
 mkdir Xwin-LM-70B-V0.1-GPTQ
-huggingface-cli download TheBloke/Xwin-LM-70B-V0.1-GPTQ --revision main --local-dir Xwin-LM-70B-V0.1-GPTQ --local-dir-use-symlinks False
+huggingface-cli download TheBloke/Xwin-LM-70B-V0.1-GPTQ --revision gptq-4bit-128g-actorder_True --local-dir Xwin-LM-70B-V0.1-GPTQ --local-dir-use-symlinks False
 ```
 
 <details>
@@ -156,7 +156,7 @@ Windows Command Line users: You can set the environment variable by running `set
 To clone a specific branch with `git`, use a command like this:
 
 ```shell
-git clone --single-branch --branch main https://huggingface.co/TheBloke/Xwin-LM-70B-V0.1-GPTQ
+git clone --single-branch --branch gptq-4bit-128g-actorder_True https://huggingface.co/TheBloke/Xwin-LM-70B-V0.1-GPTQ
 ```
 
 Note that using Git with HF repos is strongly discouraged. It will be much slower than using `huggingface-hub`, and will use twice as much disk space as it has to store the model files twice (it stores every byte both in the intended target folder, and again in the `.git` folder as a blob.)
@@ -171,7 +171,7 @@ It is strongly recommended to use the text-generation-webui one-click-installers
 
 1. Click the **Model tab**.
 2. Under **Download custom model or LoRA**, enter `TheBloke/Xwin-LM-70B-V0.1-GPTQ`.
-- To download from a specific branch, enter for example `TheBloke/Xwin-LM-70B-V0.1-GPTQ:main`
+- To download from a specific branch, enter for example `TheBloke/Xwin-LM-70B-V0.1-GPTQ:gptq-4bit-128g-actorder_True`
 - see Provided Files above for the list of branches for each option.
 3. Click **Download**.
 4. The model will start downloading. Once it's finished it will say "Done".
@@ -212,7 +212,7 @@ from transformers import AutoModelForCausalLM, AutoTokenizer, pipeline
 
 model_name_or_path = "TheBloke/Xwin-LM-70B-V0.1-GPTQ"
 # To use a different branch, change revision
-# For example: revision="main"
+# For example: revision="gptq-4bit-128g-actorder_True"
 model = AutoModelForCausalLM.from_pretrained(model_name_or_path,
                                              device_map="auto",
                                              trust_remote_code=False,
````
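
The `repo:branchname` download-box syntax that this commit corrects can be split into a repo ID and a revision in the way text-generation-webui's download box does. A minimal sketch, assuming the simple "everything after the first colon is the branch, defaulting to `main`" rule described in the README (the helper name is mine, not from any library):

```python
def split_model_spec(spec: str) -> tuple[str, str]:
    """Split a "user/repo:branchname" download spec into (repo_id, revision).

    If no ":branchname" suffix is present, the revision defaults to "main",
    matching the README's description of the download box behaviour.
    """
    repo_id, _, branch = spec.partition(":")
    return repo_id, branch or "main"


# The branch-specific spec from this commit:
print(split_model_spec("TheBloke/Xwin-LM-70B-V0.1-GPTQ:gptq-4bit-128g-actorder_True"))
# The bare spec falls back to the main branch:
print(split_model_spec("TheBloke/Xwin-LM-70B-V0.1-GPTQ"))
```

The resulting tuple maps directly onto the `huggingface-cli download <repo_id> --revision <branch>` invocation shown in the diff above.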