---
tags:
- generated_from_trainer
datasets:
- wikitext
model-index:
- name: opt-2.7b-wikitext2
  results: []
---

# opt-2.7b-wikitext2

This model is a fine-tuned version of facebook/opt-2.7b on the wikitext wikitext-2-raw-v1 dataset.
It achieves the following results on the evaluation set:
- eval_loss: 2.3983
- eval_accuracy: 0.5033
- eval_runtime: 100.7746
- eval_samples_per_second: 2.411
- eval_steps_per_second: 1.211
- step: 0

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 2
- eval_batch_size: 2
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 2.0

### Framework versions

- Transformers 4.27.4
- Pytorch 1.13.0+cu116
- Datasets 2.9.0
- Tokenizers 0.13.2
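
### Example training configuration (sketch)

The hyperparameters listed under "Training hyperparameters" above map directly onto `TrainingArguments`. The following is a minimal sketch of how such a run could be reproduced with the `Trainer` API; the exact script used, the output path, and the data preprocessing (here simple padding via `DataCollatorForLanguageModeling` rather than block concatenation) are assumptions, not recorded details of this run.

```python
# A minimal sketch, assuming the model was trained with the Hugging Face
# Trainer. The output_dir and the padding-based preprocessing are
# assumptions; the hyperparameter values mirror the list above.
from datasets import load_dataset
from transformers import (
    AutoModelForCausalLM,
    AutoTokenizer,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)

model_name = "facebook/opt-2.7b"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

# Dataset configuration named in this card.
raw = load_dataset("wikitext", "wikitext-2-raw-v1")
tokenized = raw.map(
    lambda batch: tokenizer(batch["text"]),
    batched=True,
    remove_columns=["text"],
)
# Drop the empty lines wikitext contains so every example has tokens.
tokenized = tokenized.filter(lambda example: len(example["input_ids"]) > 0)

# mlm=False makes the collator copy input_ids into labels for causal LM.
collator = DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm=False)

args = TrainingArguments(
    output_dir="opt-2.7b-wikitext2",  # hypothetical output path
    learning_rate=3e-5,
    per_device_train_batch_size=2,
    per_device_eval_batch_size=2,
    num_train_epochs=2.0,
    lr_scheduler_type="linear",
    seed=42,
    adam_beta1=0.9,   # the Adam settings listed above match the defaults
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=tokenized["train"],
    eval_dataset=tokenized["validation"],
    data_collator=collator,
)
trainer.train()
```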
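
### How to use (sketch)

A minimal generation example, assuming the fine-tuned checkpoint is available locally or on the Hub; `your-namespace/opt-2.7b-wikitext2` is a placeholder repository id, not a confirmed location.

```python
# A minimal sketch for text generation with the fine-tuned checkpoint.
# Replace the placeholder repo id with the actual checkpoint location.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

checkpoint = "your-namespace/opt-2.7b-wikitext2"  # placeholder repo id
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForCausalLM.from_pretrained(checkpoint, torch_dtype=torch.float16)

inputs = tokenizer("Wikipedia articles usually begin with", return_tensors="pt")
output_ids = model.generate(**inputs, max_new_tokens=50, do_sample=False)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```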