---
datasets:
- EleutherAI/pile
language:
- en
tags:
- t5x
- encoder-decoder
---

This is the T5x version of Pile-T5 Large. You can use these checkpoints to continue
pretraining or to fine-tune with the [T5x](https://github.com/google-research/t5x) library.
The scripts used to train Pile-T5 are available in the [improved-t5 repository](https://github.com/EleutherAI/improved-t5)
on GitHub. To access the checkpoint at a specific training step, use the corresponding branch `step_TRAININGSTEP`; the `main` branch is left empty. A sketch of fetching one such branch follows below.
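
As a minimal sketch of pulling down a single step's checkpoint, the snippet below uses `huggingface_hub.snapshot_download` with the branch name as the `revision`. The `repo_id` shown is a placeholder for this card's repository, and `step_TRAININGSTEP` must be replaced with an actual step number from the available branches.

```python
# Hypothetical sketch: download one training-step checkpoint via huggingface_hub.
from huggingface_hub import snapshot_download

local_dir = snapshot_download(
    repo_id="EleutherAI/pile-t5-large-t5x",  # placeholder: substitute this card's repo id
    revision="step_TRAININGSTEP",            # branch holding that step's T5x checkpoint
)
print(local_dir)  # local path containing the downloaded T5x checkpoint files
```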

For the HF version, see [EleutherAI/pile-t5-large](https://huggingface.co/EleutherAI/pile-t5-large).

### BibTeX

```
@misc{2024PileT5,
  author  = {Lintang Sutawika and Aran Komatsuzaki and Colin Raffel},
  title   = {Pile-T5},
  year    = {2024},
  url     = {https://blog.eleuther.ai/pile-t5/},
  note    = {Blog post},
}
```