bjoernp committed 54a8e9d (1 parent: 2957684)

Update README.md

Files changed (1): README.md (+1 −1)
@@ -6,7 +6,6 @@ datasets:
 language:
 - en
 library_name: transformers
-pipeline_tag: text-generation
 ---
 # Model Card for Alpaca Cerebras-6.7B LoRA
 
@@ -51,6 +50,7 @@ See [github.com/bjoernpl/cerebras-lora](https://github.com/bjoernpl/cerebras-lor
 This model can be easily loaded using the AutoModelForCausalLM functionality:
 ```python
 from transformers import AutoTokenizer, AutoModelForCausalLM
+from peft import PeftModel
 tokenizer = AutoTokenizer.from_pretrained("cerebras/Cerebras-GPT-6.7B")
 model = AutoModelForCausalLM.from_pretrained("cerebras/Cerebras-GPT-6.7B", torch_dtype=torch.float16, device_map='auto', load_in_8bit=True)
 model = PeftModel.from_pretrained(model, "bjoernp/alpaca-cerebras-6.7B", torch_dtype=torch.float16, device_map='auto')
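Note that even after this commit, the README snippet references `torch.float16` without importing `torch`. A minimal sketch of a complete loading routine, assuming the same repo IDs as the README (wrapped in a function with deferred imports so nothing downloads until it is called; `load_in_8bit=True` additionally requires `bitsandbytes`):

```python
def load_alpaca_lora(device_map="auto"):
    """Load the Cerebras-GPT-6.7B base model and apply the Alpaca LoRA adapter.

    Imports are deferred so this file can be imported without torch,
    transformers, or peft installed; calling the function downloads the
    full base-model weights.
    """
    import torch
    from transformers import AutoTokenizer, AutoModelForCausalLM
    from peft import PeftModel

    tokenizer = AutoTokenizer.from_pretrained("cerebras/Cerebras-GPT-6.7B")
    model = AutoModelForCausalLM.from_pretrained(
        "cerebras/Cerebras-GPT-6.7B",
        torch_dtype=torch.float16,
        device_map=device_map,
        load_in_8bit=True,  # 8-bit quantization; requires bitsandbytes
    )
    # Wrap the base model with the LoRA adapter weights from the Hub
    model = PeftModel.from_pretrained(model, "bjoernp/alpaca-cerebras-6.7B")
    return tokenizer, model
```

The function returns both the tokenizer and the adapter-wrapped model, so a caller can run `tokenizer, model = load_alpaca_lora()` and then generate as with any causal LM.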