# My BookSum Models

A collection of 4 items, trained on the ubaada/booksum-complete-cleaned dataset.
This model is a fine-tuned version of ubaada/pegasus-x-large-booksum-16k, trained on the ubaada/booksum-complete-cleaned dataset. It achieves the following results on the evaluation set:

- Loss: 1.8948
- Rouge1: 0.3044
- Rouge2: 0.0517
- Rougel: 0.1398
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed
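A minimal usage sketch, assuming the checkpoint loads through the standard 🤗 Transformers summarization pipeline (the sample text and generation settings are illustrative only, not from this card):

```python
# Minimal usage sketch (assumption: the checkpoint works with the standard
# Transformers summarization pipeline; generation settings are illustrative).
from transformers import pipeline

summarizer = pipeline(
    "summarization",
    model="ubaada/pegasus-x-large-booksum-16k",
)

# Pegasus-X handles long inputs, so a whole chapter can be passed in one call.
chapter = (
    "Call me Ishmael. Some years ago, never mind how long precisely, "
    "having little or no money in my purse, and nothing particular to "
    "interest me on shore, I thought I would sail about a little and see "
    "the watery part of the world."
)

result = summarizer(
    chapter,
    max_length=128,   # illustrative generation limits, tune for real chapters
    min_length=32,
    truncation=True,  # guard against inputs longer than the model's maximum
)
print(result[0]["summary_text"])
```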
## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

### Training results
| Training Loss | Epoch  | Step | Validation Loss | Rouge1 | Rouge2 | Rougel |
|:-------------:|:------:|:----:|:---------------:|:------:|:------:|:------:|
| 1.3001        | 0.9992 | 314  | 1.8948          | 0.3044 | 0.0517 | 0.1398 |
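The Rouge columns in the table above are standard ROUGE scores. A rough sketch of computing such scores with the 🤗 `evaluate` library (assumed tooling, not the card's actual evaluation script; the strings are placeholders):

```python
# Sketch of computing ROUGE scores with the `evaluate` library (assumed tooling;
# the card does not state how its metrics were produced). Requires the
# `rouge_score` package. Strings below are placeholders, not evaluation data.
import evaluate

rouge = evaluate.load("rouge")
scores = rouge.compute(
    predictions=["The narrator decides to go to sea to escape his restlessness."],
    references=["Ishmael explains that he goes to sea whenever life on shore grows dreary."],
)
print(scores)  # e.g. {'rouge1': ..., 'rouge2': ..., 'rougeL': ..., 'rougeLsum': ...}
```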