Model Card for CoEdIT-Large

This model was obtained by fine-tuning the corresponding google/flan-t5-large model on the CoEdIT dataset. Details of the dataset can be found in our paper and in our repository.

Paper: CoEdIT: Text Editing by Task-Specific Instruction Tuning

Authors: Vipul Raheja, Dhruv Kumar, Ryan Koo, Dongyeop Kang

Model Details

Model Description

  • Language(s) (NLP): English
  • Finetuned from model: google/flan-t5-large

How to use

We make available the models presented in our paper.

| Model | Number of parameters |
| --- | --- |
| CoEdIT-large | 770M |
| CoEdIT-xl | 3B |
| CoEdIT-xxl | 11B |

Uses

Text Revision Task

Given an edit instruction and an original text, our model can generate the edited version of the text.

(Figure: task specifications with example edit instructions)

Usage

from transformers import AutoTokenizer, T5ForConditionalGeneration

# Load the CoEdIT-large tokenizer and model
tokenizer = AutoTokenizer.from_pretrained("grammarly/coedit-large")
model = T5ForConditionalGeneration.from_pretrained("grammarly/coedit-large")

# The model input is the edit instruction followed by the original text
input_text = "Fix grammatical errors in this sentence: When I grow up, I start to understand what he said is quite right."
input_ids = tokenizer(input_text, return_tensors="pt").input_ids

# Generate and decode the edited text
outputs = model.generate(input_ids, max_length=256)
edited_text = tokenizer.decode(outputs[0], skip_special_tokens=True)
print(edited_text)
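As shown above, the model input is simply an edit instruction prepended to the original text, separated by a colon. A minimal sketch of that prompt construction is below; the `build_prompt` helper and the alternative instruction wordings are illustrative assumptions, not an official API or an exhaustive list of supported instructions (see the paper for the full task specifications).

```python
# Minimal sketch: a CoEdIT-style input is "<instruction>: <text>".
# build_prompt and EXAMPLE_INSTRUCTIONS are hypothetical helpers for
# illustration only; they are not part of the released model or paper.

def build_prompt(instruction: str, text: str) -> str:
    """Combine an edit instruction and the source text into one model input."""
    return f"{instruction}: {text}"

# Example instruction wordings (assumed phrasings, shown for illustration)
EXAMPLE_INSTRUCTIONS = [
    "Fix grammatical errors in this sentence",
    "Paraphrase the sentence",
    "Write a simpler version for the sentence",
]

prompt = build_prompt(
    "Paraphrase the sentence",
    "The weather was terrible, so we stayed indoors.",
)
print(prompt)
# -> Paraphrase the sentence: The weather was terrible, so we stayed indoors.
```

The resulting string would be passed to the tokenizer exactly as `input_text` is in the snippet above.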

Software

https://github.com/vipulraheja/coedit

Citation

BibTeX:

@article{raheja2023coedit,
      title={CoEdIT: Text Editing by Task-Specific Instruction Tuning}, 
      author={Vipul Raheja and Dhruv Kumar and Ryan Koo and Dongyeop Kang},
      year={2023},
      eprint={2305.09857},
      archivePrefix={arXiv},
      primaryClass={cs.CL}
}

APA: Raheja, V., Kumar, D., Koo, R., & Kang, D. (2023). CoEdIT: Text Editing by Task-Specific Instruction Tuning. arXiv. https://arxiv.org/abs/2305.09857
