pszemraj committed
Commit 31e311e · 1 Parent(s): e9e4ac2

Update README.md

Files changed (1): README.md +28 -0
README.md CHANGED
---
license: cc-by-nc-sa-4.0
language:
- en
library_name: transformers
pipeline_tag: text-generation
---
# StableLM-Tuned-Alpha: sharded checkpoint

This is a sharded checkpoint (with ~2GB shards) of the model. Refer to the [original model](https://huggingface.co/stabilityai/stablelm-tuned-alpha-3b) for all details.

## Basic Usage

Install `transformers`, `accelerate`, and `bitsandbytes`:

```bash
pip install -U transformers accelerate bitsandbytes
```

Load the model in 8-bit, then [run inference](https://huggingface.co/docs/transformers/generation_strategies#contrastive-search):

```python
from transformers import AutoTokenizer, AutoModelForCausalLM

model_name = "ethzanalytics/stablelm-tuned-alpha-3b-sharded"
tokenizer = AutoTokenizer.from_pretrained(model_name)

# load the sharded weights in 8-bit; device_map="auto" places them on the available GPU(s)
model = AutoModelForCausalLM.from_pretrained(
    model_name,
    load_in_8bit=True,
    device_map="auto",
)
```
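
Continuing from the snippet above, here is a minimal generation sketch using the contrastive-search settings described in the linked guide; the prompt string and the `penalty_alpha`/`top_k`/`max_new_tokens` values are illustrative assumptions, so check the original model card for the exact prompt convention.

```python
# Minimal inference sketch with contrastive search (penalty_alpha + top_k).
# The <|USER|>/<|ASSISTANT|> prompt format below is an assumption based on the
# StableLM-Tuned-Alpha convention; see the original model card for details.
prompt = "<|USER|>Give me three tips for staying productive.<|ASSISTANT|>"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

outputs = model.generate(
    **inputs,
    penalty_alpha=0.6,   # degeneration penalty for contrastive search
    top_k=4,             # candidate pool size for contrastive search
    max_new_tokens=128,
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```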