
Controlled Text Reduction. Aviv Slobodkin, Paul Roit, Eran Hirsch, Ori Ernst, Ido Dagan. 2022. Paper: https://arxiv.org/abs/2210.13449

How to use?

from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

# Load the LED-based controlled text reduction model and its tokenizer
tokenizer = AutoTokenizer.from_pretrained("biu-nlp/led-base-controlled-text-reduction")
model = AutoModelForSeq2SeqLM.from_pretrained("biu-nlp/led-base-controlled-text-reduction")
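Since this is a standard seq2seq (LED) checkpoint, generation follows the usual transformers API. The snippet below is only a minimal sketch: the example document is made up, and the exact way highlighted spans are marked in the input is defined in the original repo, not shown here.

import torch

# Hypothetical input document; mark the spans to keep according to the
# convention described in the original repo.
document = "The first crewed moon landing was achieved by Apollo 11 in 1969 ..."

inputs = tokenizer(document, return_tensors="pt", truncation=True, max_length=1024)

# LED-style global attention on the first token (a common choice for LED models)
global_attention_mask = torch.zeros_like(inputs["input_ids"])
global_attention_mask[:, 0] = 1

output_ids = model.generate(
    **inputs,
    global_attention_mask=global_attention_mask,
    max_length=256,
    num_beams=4,
)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))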

The original repo is here.

If you find our work useful, please cite the paper as:

@misc{https://doi.org/10.48550/arxiv.2210.13449,
  doi = {10.48550/ARXIV.2210.13449},
  url = {https://arxiv.org/abs/2210.13449},
  author = {Slobodkin, Aviv and Roit, Paul and Hirsch, Eran and Ernst, Ori and Dagan, Ido},
  keywords = {Computation and Language (cs.CL), FOS: Computer and information sciences},
  title = {Controlled Text Reduction},
  publisher = {arXiv},
  year = {2022},
  copyright = {Creative Commons Zero v1.0 Universal}
}