Defines the maximum length H2O LLM Studio uses for the generated text.

- Similar to the **Max Length** setting in the *tokenizer settings* section, this setting specifies the maximum number of tokens to predict for a given prediction sample.
- This setting impacts the evaluation metrics and should be set based on the dataset and the average output sequence length that is expected to be predicted.
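
The effect of this cap can be sketched with a toy generation loop (the `generate` helper and `next_token_fn` below are hypothetical illustrations, not H2O LLM Studio code): generation stops either at an end-of-sequence token or after the maximum number of new tokens has been predicted, whichever comes first.

```python
def generate(next_token_fn, prompt, max_length, eos_token=0):
    """Append predicted tokens until EOS or max_length new tokens."""
    tokens = list(prompt)
    for _ in range(max_length):
        tok = next_token_fn(tokens)
        if tok == eos_token:
            break
        tokens.append(tok)
    return tokens

# Toy next-token function that never emits EOS, so generation
# is bounded only by the max-length cap.
out = generate(lambda toks: 1, prompt=[5, 6], max_length=8)
print(len(out) - 2)  # number of newly predicted tokens
```

If the cap is set below the typical target length for the dataset, predictions are truncated and evaluation metrics degrade; if it is set far above, inference simply takes longer.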