Files changed (1)
  1. README.md +4 -5
README.md CHANGED
@@ -24,7 +24,7 @@ img {
 }
 </style>
 
-|[![Model architecture](https://img.shields.io/badge/Model%20Arch-Transformer%20Decoder-green)](#model-architecture)|[![Model size](https://img.shields.io/badge/Params-126M-green)](#model-architecture)|[![Language](https://img.shields.io/badge/Language-en--US-lightgrey#model-badge)](#datasets)
+|[![Model architecture](https://img.shields.io/badge/Model%20Arch-Transformer%20Decoder-green)](#model-architecture)|[![Model size](https://img.shields.io/badge/Params-128M-green)](#model-architecture)|[![Language](https://img.shields.io/badge/Language-en--US-lightgrey#model-badge)](#datasets)
 
 
 ## Model Description
@@ -49,13 +49,12 @@ This model can be easily loaded using the `AutoModelForCausalLM` functionality:
 
 ```python
 
-from transformers import AutoModelForCausalLM, AutoTokenizer
 import torch
+from transformers import AutoModelForCausalLM, AutoTokenizer
 
-model = AutoModelForCausalLM.from_pretrained("Writer/palmyra-small", torch_dtype=torch.float16).cuda()
+model = AutoModelForCausalLM.from_pretrained("Writer/palmyra-small")
 
-# the fast tokenizer currently does not work correctly
-tokenizer = AutoTokenizer.from_pretrained("Writer/palmyra-small", use_fast=False)
+tokenizer = AutoTokenizer.from_pretrained("Writer/palmyra-small")
 
 
 ```
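
For reference, a minimal sketch of how the updated snippet could be exercised end to end. The model and tokenizer calls mirror the new README code; the prompt string and generation settings (`max_new_tokens`, `do_sample`, `top_k`, `temperature`) are illustrative assumptions, not part of the diff.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Load the checkpoint as in the updated README snippet.
model = AutoModelForCausalLM.from_pretrained("Writer/palmyra-small")
tokenizer = AutoTokenizer.from_pretrained("Writer/palmyra-small")

# Hypothetical prompt and generation settings, chosen only for illustration.
prompt = "Palmyra is a family of language models that"
inputs = tokenizer(prompt, return_tensors="pt")

with torch.no_grad():
    output_ids = model.generate(
        **inputs,
        max_new_tokens=50,
        do_sample=True,
        top_k=50,
        temperature=0.7,
    )

print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```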