pe4enov committed on
Commit f82a40d
1 Parent(s): 6691d9b

Update README.md

Files changed (1): README.md (+3 −6)
README.md CHANGED
@@ -5,18 +5,15 @@ language:
 
 Квантизированная версия модели <a href="https://huggingface.co/ai-forever/ruGPT-3.5-13B">ruGPT-3.5</a>
 
- <code>
+ ``` python
 # Use a pipeline as a high-level helper
 from transformers import pipeline
 
 pipe = pipeline("text-generation", model="pe4enov/ruGPT-3.5-13B-8bit")
- </code>
 
-
- <code>
 # Load model directly
 from transformers import AutoTokenizer, AutoModelForCausalLM
 
- tokenizer = AutoTokenizer.from_pretrained("pe4enov/ruGPT-3.5-13B-8bit")
+ tokenizer = AutoTokenizer.from_pretrained("ai-forever/ruGPT-3.5-13B")
 model = AutoModelForCausalLM.from_pretrained("pe4enov/ruGPT-3.5-13B-8bit")
+ ```
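
For context, a minimal usage sketch of how the two snippets from the updated README could be put together to generate text. The prompt, the generation parameters (`max_new_tokens`, `do_sample`, `top_p`), and the combined layout are illustrative assumptions and are not part of the commit itself.

``` python
# Hypothetical usage sketch (not part of the commit): running the two
# snippets from the updated README. Prompt and generation settings are
# illustrative assumptions.
from transformers import AutoModelForCausalLM, AutoTokenizer, pipeline

# High-level helper, as in the README.
pipe = pipeline("text-generation", model="pe4enov/ruGPT-3.5-13B-8bit")
print(pipe("Квантизация модели нужна для того, чтобы", max_new_tokens=40)[0]["generated_text"])

# Direct loading: the tokenizer comes from the original ai-forever repo,
# the quantized weights from pe4enov, as in the updated README.
tokenizer = AutoTokenizer.from_pretrained("ai-forever/ruGPT-3.5-13B")
model = AutoModelForCausalLM.from_pretrained("pe4enov/ruGPT-3.5-13B-8bit")

inputs = tokenizer("Квантизация модели нужна для того, чтобы", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=40, do_sample=True, top_p=0.95)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```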