# T5-Flan-Small
---
license: apache-2.0
datasets:
  - Open-Orca/FLAN
language:
  - en
pipeline_tag: text2text-generation
---

A T5-small model distilled from T5-flan-base on Open-Orca's FLAN dataset using a Seq2seq distillation methodology. It serves as the baseline for Seq2seq distillation of T5 decoder models, and is freely available under the Apache-2.0 license.

Note that the distilled model uses the T5-small architecture (60M parameters); it is not a T5-flan-small model.
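Since the card advertises the `text2text-generation` pipeline tag, a minimal usage sketch with the `transformers` library is shown below. The Hub repo id `Sayan01/T5-Flan-Small` is an assumption inferred from this card's path; substitute the actual repo id if it differs.

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

# Assumed repo id, inferred from this model card's path on the Hub.
model_id = "Sayan01/T5-Flan-Small"

def generate(prompt: str, max_new_tokens: int = 64) -> str:
    """Run text2text generation with the distilled T5-small checkpoint."""
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForSeq2SeqLM.from_pretrained(model_id)
    inputs = tokenizer(prompt, return_tensors="pt")
    output_ids = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(output_ids[0], skip_special_tokens=True)
```

Because the model was distilled on FLAN-style instruction data, plain instruction prompts (e.g. `generate("Translate to German: Hello, world.")`) are the expected input format.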