iquery committed
Commit: e2fff74
Parent: 056271d

Add max_new_tokens to the config


To avoid the warning:
```
UserWarning: Neither `max_length` nor `max_new_tokens` has been set, `max_length` will default to 20 (`self.config.max_length`). Controlling `max_length` via the config is deprecated and `max_length` will be removed from the config in v5 of Transformers -- we recommend using `max_new_tokens` to control the maximum length of the generation.
warnings.warn
```
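For context, a minimal caller-side sketch of the same fix, assuming a seq2seq checkpoint like this one is loaded with Transformers; the model id and prompt below are placeholders, not the actual repo:

```
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

model_id = "some-org/some-t5-model"  # hypothetical placeholder, not this repo's id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

inputs = tokenizer("summarize: <article text>", return_tensors="pt")

# Passing max_new_tokens explicitly (the same knob this commit sets in
# config.json) keeps generate() from falling back to the deprecated
# config.max_length default, which is what triggers the UserWarning.
outputs = model.generate(**inputs, max_new_tokens=25)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```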

Files changed (1)
  1. config.json +1 -0
config.json CHANGED
@@ -18,6 +18,7 @@
   "output_past": true,
   "pad_token_id": 0,
   "relative_attention_num_buckets": 32,
+  "max_new_tokens": 25,
   "task_specific_params": {
     "summarization": {
       "early_stopping": true,