Decoding Parameters?
#59
by dataviral - opened
Hi all,
I would like to know the decoding parameters used when generating outputs from the LLMs. Were they kept at their defaults, and what max_new_tokens length was used?
My struggle with open-source LLMs is that they repeat content too often and do not terminate generation at appropriate points.
It would be good to learn more about the parameters used during the evaluations.
Thanks!
The parameters were the default ones from the EleutherAI Harness :)
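For anyone who wants to experiment with these settings themselves, here is a minimal sketch of passing common decoding parameters through the transformers `generate` API. The model name and the specific values below are illustrative only, not the leaderboard's actual configuration (which used the harness defaults):

```python
# Minimal sketch: overriding common decoding parameters with transformers.
# Model name and parameter values are illustrative, not the leaderboard's settings.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "gpt2"  # hypothetical placeholder model
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

inputs = tokenizer("Question: What is 2 + 2?\nAnswer:", return_tensors="pt")

outputs = model.generate(
    **inputs,
    max_new_tokens=64,                     # cap on generated length
    do_sample=False,                       # greedy decoding
    repetition_penalty=1.1,                # >1.0 discourages repeated content
    eos_token_id=tokenizer.eos_token_id,   # stop at end-of-sequence
    pad_token_id=tokenizer.eos_token_id,   # avoid warnings for models without a pad token
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Tightening `max_new_tokens`, raising `repetition_penalty` slightly, and making sure the correct `eos_token_id` is set are the usual first steps when a model repeats itself or fails to stop.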
clefourrier changed discussion status to closed