---
datasets:
- EleutherAI/pile
language:
- en
tags:
- t5x
- encoder-decoder
---

This is the T5x version of Pile-T5 Large. You can use these checkpoints to continue
pretraining or to finetune with the [T5x](https://github.com/google-research/t5x) library.
The scripts used to train Pile-T5 are available in the [improved-t5 repository](https://github.com/EleutherAI/improved-t5)
on GitHub. To access the checkpoint at a specific training step, check out the corresponding
branch `step_TRAININGSTEP`; the `main` branch is left empty.
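
For example, a single step's checkpoint can be fetched with a shallow, single-branch clone. The sketch below only prints the command as a dry run; the step number is illustrative, and the clone URL is an assumed placeholder that should be replaced with the URL shown on this repository's Hugging Face page:

```shell
# Branches follow the step_TRAININGSTEP naming scheme; the step number
# below is illustrative -- substitute a step that actually exists.
STEP=100000
BRANCH="step_${STEP}"

# Clone only that branch, shallowly, to avoid downloading every checkpoint.
# The URL is an assumed placeholder for this repository's clone URL.
CMD="git clone --branch ${BRANCH} --single-branch --depth 1 https://huggingface.co/EleutherAI/pile-t5-large-t5x"
echo "${CMD}"  # printed as a dry run; execute the command to download
```

Note that large checkpoint files on the Hugging Face Hub are stored with Git LFS, so running `git lfs install` beforehand may be necessary for a complete clone.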

For the Hugging Face Transformers version, see [EleutherAI/pile-t5-large](https://huggingface.co/EleutherAI/pile-t5-large).

### BibTeX

```bibtex
@misc{2024PileT5,
  author = {Lintang Sutawika and Aran Komatsuzaki and Colin Raffel},
  title  = {Pile-T5},
  year   = {2024},
  url    = {https://blog.eleuther.ai/pile-t5/},
  note   = {Blog post},
}
```