---
license: mit
tags:
- generated_from_trainer
datasets:
- qfrodicio/gesture-prediction-9-classes
metrics:
- accuracy
- precision
- recall
- f1
model-index:
- name: roberta-finetuned-gesture-prediction-9-classes
  results: []
---

# roberta-finetuned-gesture-prediction-9-classes

This model is a fine-tuned version of [roberta-base](https://huggingface.co/roberta-base) on the [qfrodicio/gesture-prediction-9-classes](https://huggingface.co/datasets/qfrodicio/gesture-prediction-9-classes) dataset.

It achieves the following results on the validation set (epoch 4, the checkpoint with the lowest validation loss in the training results table below):
- Loss: 0.6668
- Accuracy: 0.8289
- Precision: 0.8288
- Recall: 0.8289
- F1: 0.8258

It achieves the following results on the test set:
- Loss: 0.6158
- Accuracy: 0.83
- Precision: 0.8296
- Recall: 0.83
- F1: 0.8274

## Model description

The model is roberta-base fine-tuned to predict gestures across 9 classes from text, using the qfrodicio/gesture-prediction-9-classes dataset. Training details and evaluation results are given below.

## Intended uses & limitations

More information needed. A minimal usage sketch is provided at the end of this card.

## Training and evaluation data

The model was trained and evaluated on the [qfrodicio/gesture-prediction-9-classes](https://huggingface.co/datasets/qfrodicio/gesture-prediction-9-classes) dataset.

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 2e-05
- weight_decay: 0.01
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy | Precision | Recall | F1 |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:---------:|:------:|:------:|
| 1.7138 | 1.0 | 87 | 1.0975 | 0.6915 | 0.6290 | 0.6915 | 0.6303 |
| 0.846 | 2.0 | 174 | 0.7497 | 0.7948 | 0.7790 | 0.7948 | 0.7772 |
| 0.5545 | 3.0 | 261 | 0.7078 | 0.8020 | 0.8000 | 0.8020 | 0.7927 |
| 0.3955 | 4.0 | 348 | 0.6668 | 0.8289 | 0.8288 | 0.8289 | 0.8258 |
| 0.279 | 5.0 | 435 | 0.6922 | 0.8291 | 0.8340 | 0.8291 | 0.8277 |
| 0.2203 | 6.0 | 522 | 0.6955 | 0.8373 | 0.8390 | 0.8373 | 0.8336 |
| 0.1595 | 7.0 | 609 | 0.7149 | 0.8395 | 0.8405 | 0.8395 | 0.8365 |
| 0.1349 | 8.0 | 696 | 0.7065 | 0.8436 | 0.8447 | 0.8436 | 0.8399 |
| 0.1047 | 9.0 | 783 | 0.7408 | 0.8481 | 0.8502 | 0.8481 | 0.8445 |
| 0.0906 | 10.0 | 870 | 0.7439 | 0.8495 | 0.8501 | 0.8495 | 0.8465 |

### Framework versions

- Transformers 4.26.1
- Pytorch 1.13.1+cu116
- Datasets 2.10.1
- Tokenizers 0.13.2
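
## How to use

Below is a minimal usage sketch, not part of the original training code. It assumes the checkpoint is hosted as `qfrodicio/roberta-finetuned-gesture-prediction-9-classes` (the model name under the dataset owner's namespace) and exposes a text-classification head that maps an input sentence to one of the 9 gesture classes; if the model actually tags gestures per token, swap the task string for `"token-classification"`.

```python
# Usage sketch: the repository id and the "text-classification" task are
# assumptions based on the model name and dataset owner, not confirmed by this card.
from transformers import pipeline

gesture_classifier = pipeline(
    "text-classification",
    model="qfrodicio/roberta-finetuned-gesture-prediction-9-classes",
)

# Returns a list of {"label": ..., "score": ...} dicts with the predicted gesture class.
print(gesture_classifier("I am really happy to see you here today!"))
```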