The current text generation call will exceed the model's predefined maximum length (4096). Depending on the model, you may observe exceptions, performance degradation, or nothing at all.
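This warning fires when the prompt's token count plus the requested number of new tokens would overrun the model's context window (4096 here). A minimal sketch of that check, using hypothetical token counts (not the actual transformers implementation):

```python
MAX_MODEL_LENGTH = 4096  # the model's predefined maximum length from the warning

def exceeds_context(prompt_tokens: int, max_new_tokens: int,
                    max_length: int = MAX_MODEL_LENGTH) -> bool:
    """Return True if prompt + requested generation overruns the context window."""
    return prompt_tokens + max_new_tokens > max_length

# A 3500-token prompt plus 1024 new tokens overruns a 4096-token window.
print(exceeds_context(3500, 1024))  # True
# A 3000-token prompt plus 512 new tokens fits.
print(exceeds_context(3000, 512))   # False
```

Trimming the prompt or lowering `max_new_tokens` so the sum stays within the window silences the warning.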

#3
by rjmehta - opened

@TheBloke Please help. Thanks.

rjmehta changed discussion status to closed
