cyrilzhang committed on
Commit 1e5b7d8
Parent: ed23458

Update README.md

Files changed (1):
  1. README.md (+1, -1)
README.md CHANGED
@@ -24,4 +24,4 @@ gpt2_tokenizer.decode([10163, 46387]) # '123 pigeon'
 - This is for my investigations into the arithmetic capabilities of large language models. There is no model here, only a tokenizer.
 - [PaLM](https://arxiv.org/abs/2204.02311) does this.
 - I think it's very reasonable.
-- Many models (illustriously, [GPT-3](https://arxiv.org/abs/2005.14165)) use the GPT-2 tokenizer, which doesn't do this.
+- Many models (illustriously, [GPT-3](https://arxiv.org/abs/2005.14165)) don't do this, because they use the GPT-2 tokenizer.
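
For context, the snippet in the hunk header shows the behavior the README is contrasting: GPT-2's BPE vocabulary merges runs of digits into single tokens, whereas PaLM-style tokenization splits every digit out. A minimal sketch reproducing this with the Hugging Face `transformers` library; it uses the stock `"gpt2"` checkpoint, not this repo's tokenizer:

```python
from transformers import AutoTokenizer

# Stock GPT-2 tokenizer: BPE merges multi-digit numbers into single tokens.
gpt2_tokenizer = AutoTokenizer.from_pretrained("gpt2")

ids = gpt2_tokenizer.encode("123 pigeon")
print(ids)                         # [10163, 46387] -> '123' and ' pigeon'
print(gpt2_tokenizer.decode(ids))  # '123 pigeon'

# A digit-splitting tokenizer (as PaLM uses, and as this repo provides)
# would instead emit one token per digit: '1', '2', '3', then ' pigeon'.
```

Per-digit tokens give a model a uniform representation of numbers, which is the motivation behind the README's arithmetic investigations.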