---
library_name: transformers
tags:
- generated_from_trainer
model-index:
- name: paper-cutting
  results: []
datasets:
- hidonbush/paper-cuttingv0.1
language:
- en
- zh
metrics:
- accuracy
base_model:
- nvidia/mit-b5
---

# paper-cutting

This model is a fine-tuned version of [nvidia/mit-b5](https://huggingface.co/nvidia/mit-b5) on the paper-cutting dataset v0.1. It was trained to extract the body content from sources such as articles and books, as if cutting it out of the paper.

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

paper-cutting v0.1

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 6e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 50

### Framework versions

- Transformers 4.45.1
- Pytorch 2.4.0
- Datasets 3.0.1
- Tokenizers 0.20.0
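
## How to use

The snippet below is a minimal inference sketch, not an official usage example. It assumes the checkpoint is published as a `SegformerForSemanticSegmentation` model (the standard segmentation head for the MiT-B5 backbone); the repo id `hidonbush/paper-cutting` is a hypothetical placeholder, so substitute the actual model id and check the checkpoint's label mapping.

```python
# Minimal sketch, assuming a SegFormer semantic-segmentation checkpoint;
# the repo id below is hypothetical: replace it with the real one.
import torch
from PIL import Image
from transformers import AutoImageProcessor, SegformerForSemanticSegmentation

model_id = "hidonbush/paper-cutting"  # hypothetical repo id

processor = AutoImageProcessor.from_pretrained(model_id)
model = SegformerForSemanticSegmentation.from_pretrained(model_id)
model.eval()

# A scanned page whose body text should be "cut out".
image = Image.open("page.png").convert("RGB")
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits  # shape (1, num_labels, H/4, W/4)

# SegFormer predicts at 1/4 resolution; upsample back to the input size.
upsampled = torch.nn.functional.interpolate(
    logits, size=image.size[::-1], mode="bilinear", align_corners=False
)
mask = upsampled.argmax(dim=1)[0]  # per-pixel class ids, e.g. body vs. background
```

The resulting mask can then be used to crop or blank out everything except the detected body regions of the page.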