orena committed on
Commit
cc5bffa
1 Parent(s): 7971745

Set max_length to 50 to avoid deprecation message


When running without `max_length`, the following deprecation warning is emitted:
```
Setting `pad_token_id` to `eos_token_id`:50256 for open-end generation.
/home/oamsalem/.local/lib/python3.9/site-packages/transformers/generation_utils.py:1359: UserWarning: Neither `max_length` nor `max_new_tokens` has been set, `max_length` will default to 50 (`self.config.max_length`). Controlling `max_length` via the config is deprecated and `max_length` will be removed from the config in v5 of Transformers -- we recommend using `max_new_tokens` to control the maximum length of the generation.
warnings.warn(
```
Setting `max_length` explicitly avoids this message.
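For context, the warning's two length controls have different semantics: `max_length` bounds the prompt plus the generated tokens combined, while the recommended `max_new_tokens` bounds only the newly generated tokens. A minimal sketch of that arithmetic (the helper function is hypothetical, written here only to illustrate the documented semantics, not a call into `transformers`):

```python
from typing import Optional

def total_output_tokens(prompt_tokens: int,
                        max_length: Optional[int] = None,
                        max_new_tokens: Optional[int] = None) -> int:
    """Upper bound on the length of a generation's output sequence."""
    if max_new_tokens is not None:
        # max_new_tokens counts only the tokens generated beyond the prompt
        return prompt_tokens + max_new_tokens
    if max_length is not None:
        # max_length counts prompt and generated tokens together
        return max_length
    # Falling through to the config default of 50 is what triggers the warning
    return 50
```

So `max_length=50` on a long prompt leaves little room for new text, whereas `max_new_tokens=50` always allows 50 generated tokens regardless of prompt length.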

Files changed (1)
  1. README.md +1 -1
README.md CHANGED
@@ -35,7 +35,7 @@ You can use this model directly with a pipeline for text generation. This exampl
  ```py
  >>> from transformers import pipeline
  >>> generator = pipeline('text-generation', model='EleutherAI/gpt-neo-1.3B')
- >>> generator("EleutherAI has", do_sample=True, min_length=50)
+ >>> generator("EleutherAI has", do_sample=True, min_length=50, max_length=50)
 
  [{'generated_text': 'EleutherAI has made a commitment to create new software packages for each of its major clients and has'}]
  ```