Text2Text Generation
Transformers
PyTorch
Safetensors
English
t5
Inference Endpoints
text-generation-inference
mrm8488 committed on
Commit
edea128
1 Parent(s): 105cfb0

Fix example script

Files changed (1)
  1. README.md +1 -1
README.md CHANGED
@@ -73,7 +73,7 @@ model = T5ForConditionalGeneration.from_pretrained("grammarly/coedit-xl")
  input_text = 'Fix grammatical errors in this sentence: New kinds of vehicles will be invented with new technology than today.'
  input_ids = tokenizer(input_text, return_tensors="pt").input_ids
  outputs = model.generate(input_ids, max_length=256)
- edited_text = tokenizer.decode(outputs[0], skip_special_tokens=True)[0]
+ edited_text = tokenizer.decode(outputs[0], skip_special_tokens=True)
  ```
 
 
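
For context, here is a minimal sketch of the corrected example after this commit. Only the `T5ForConditionalGeneration` line appears in the hunk header above; loading the tokenizer with `AutoTokenizer` is an assumption based on the usual pattern for this checkpoint. The fix matters because `tokenizer.decode()` already returns a string, so the old trailing `[0]` only kept its first character.

```python
from transformers import AutoTokenizer, T5ForConditionalGeneration

# Assumed tokenizer setup; not shown in the diff hunk itself.
tokenizer = AutoTokenizer.from_pretrained("grammarly/coedit-xl")
model = T5ForConditionalGeneration.from_pretrained("grammarly/coedit-xl")

input_text = 'Fix grammatical errors in this sentence: New kinds of vehicles will be invented with new technology than today.'
input_ids = tokenizer(input_text, return_tensors="pt").input_ids
outputs = model.generate(input_ids, max_length=256)

# decode() returns the full string; no extra [0] indexing needed.
edited_text = tokenizer.decode(outputs[0], skip_special_tokens=True)
print(edited_text)
```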