cyrilzhang committed
Commit f50db44
1 Parent(s): 1e5b7d8

Update README.md

Files changed (1)
  1. README.md +1 -2
README.md CHANGED
@@ -22,6 +22,5 @@ gpt2_tokenizer.decode([10163, 46387]) # '123 pigeon'
  ```

  - This is for my investigations into the arithmetic capabilities of large language models. There is no model here, only a tokenizer.
- - [PaLM](https://arxiv.org/abs/2204.02311) does this.
- - I think it's very reasonable.
+ - [PaLM](https://arxiv.org/abs/2204.02311) does this. I think it's very reasonable.
  - Many models (illustriously, [GPT-3](https://arxiv.org/abs/2005.14165)) don't do this, because they use the GPT-2 tokenizer.
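
For context on the snippet in the hunk header above, here is a minimal sketch (assuming the Hugging Face `transformers` library and the stock `gpt2` checkpoint) of the behavior the README contrasts: GPT-2's BPE merges multi-digit numbers into single tokens, which is exactly what a PaLM-style digit-splitting tokenizer avoids.

```python
from transformers import AutoTokenizer

# Stock GPT-2 tokenizer: BPE merges the digits of "123" into one token,
# so the model never sees the number's digit-by-digit structure.
gpt2_tokenizer = AutoTokenizer.from_pretrained("gpt2")

print(gpt2_tokenizer.encode("123 pigeon"))    # [10163, 46387] ("123" is a single token)
print(gpt2_tokenizer.decode([10163, 46387]))  # '123 pigeon'

# A PaLM-style tokenizer would instead give every digit its own token,
# e.g. '123' -> '1', '2', '3', preserving the place-value structure
# that matters for arithmetic.
```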