mayank-mishra committed on
Commit
d6dedc3
1 Parent(s): 6b5821c

update example

Files changed (1): README.md +1 -1
README.md CHANGED

@@ -263,7 +263,7 @@ output = model.generate(**input_tokens)
 output = tokenizer.batch_decode(output)
 # loop over the batch to print, in this example the batch size is 1
 for i in output:
-    print(output)
+    print(i)
 ```
 
 ## Training Data
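The fix matters because `tokenizer.batch_decode` returns a list of decoded strings, one per batch element, so the loop should print the loop variable rather than the whole list. A minimal sketch of the corrected pattern, using a stand-in list in place of real model output (no model is loaded here, so the string content is illustrative only):

```python
# Stand-in for tokenizer.batch_decode(output): a list of decoded strings,
# one entry per batch element (batch size 1, as in the README example).
output = ["def add(a, b):\n    return a + b"]

# Corrected loop from the commit: print each decoded sequence (`i`),
# not the whole list as the pre-fix code did.
for i in output:
    print(i)
```

With a batch size of 1 the old `print(output)` merely printed a one-element list, but with larger batches it would have re-printed the entire list on every iteration; printing `i` scales correctly to any batch size.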