This model is part of the GrammarCorrector tool.

"FlanT5 from scratch for the grammar correction tool" article about how this model was trained:

FlanT5 was trained on the JFLEG dataset. The primary objective of the experiment was to develop a highly effective tool using relatively small models, minimal data, and constrained computational resources.
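Once downloaded, the model can be queried like any other seq2seq checkpoint on the Hub. A minimal sketch using the `transformers` `pipeline` API is below; the `"grammar: "` task prefix and the `make_prompt` helper are assumptions for illustration, so check the training article for the exact prompt format used during fine-tuning.

```python
def make_prompt(text: str) -> str:
    # The "grammar: " prefix is a hypothetical task prefix; the actual
    # fine-tuning setup may expect raw text or a different prefix.
    return "grammar: " + text.strip()

def correct(text: str) -> str:
    # Lazy import so the heavy dependency is only pulled in at inference time.
    from transformers import pipeline
    corrector = pipeline("text2text-generation",
                         model="akhmat-s/t5-base-grammar-corrector")
    # The pipeline returns a list of dicts with a "generated_text" field.
    return corrector(make_prompt(text), max_length=128)[0]["generated_text"]
```

For example, `correct("This sentences has bads grammar.")` returns the model's corrected rewrite of the input sentence.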

To accomplish this goal, we implemented two key strategies:

Model size: 248M params (Safetensors, F32)

Model tree for akhmat-s/t5-base-grammar-corrector

Base model: google-t5/t5-base (fine-tuned)
Dataset used to train akhmat-s/t5-base-grammar-corrector: JFLEG
