---
license: apache-2.0
language:
- en
metrics:
- f1
---

# switch-base-8-finetuned

This model is a fine-tuned version of [google/switch-base-8](https://huggingface.co/google/switch-base-8) on the SemEval-2018 Task 2 (English emoji prediction) dataset.

It achieves the following results on the evaluation set:

- Accuracy: 50.174 %
- Macro-F1: 36.660 %

# Model description

- **Model type:** Language model
- **Language(s) (NLP):** English
- **License:** Apache 2.0
- **Related Models:** [All Switch Transformers Checkpoints](https://huggingface.co/models?search=switch)
- **Original Checkpoints:** [All Original Switch Transformers Checkpoints](https://github.com/google-research/t5x/blob/main/docs/models.md#mixture-of-experts-moe-checkpoints)
- **Resources for more information:**
  - [Research paper](https://arxiv.org/pdf/2101.03961.pdf)
  - [GitHub Repo](https://github.com/google-research/t5x)
  - [Hugging Face Switch Transformers Docs (similar to T5)](https://huggingface.co/docs/transformers/model_doc/switch_transformers)

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 1e-4
- train_batch_size: 464
- eval_batch_size: 512
- seed: 42
- num_epochs: 30

### Testing results

| SemEval testing data | Accuracy | Macro-F1 |
|:---------------------------------------------------:|:------------:|:----------:|
| "Tübingen-Oslo" (first-place SemEval-2018 team) | 47.09% | 35.99% |
| [switch-base-8-finetuned-SemEval-2018-emojis-cen-1](https://huggingface.co/Karim-Gamal/switch-base-8-finetuned-SemEval-2018-emojis-cen-1) | 48.040% | 33.239% |
| [switch-base-8-finetuned-SemEval-2018-emojis-cen-2](https://huggingface.co/Karim-Gamal/switch-base-8-finetuned-SemEval-2018-emojis-cen-2) | 50.174% | 36.660% |
| [switch-base-8-finetuned-SemEval-2018-emojis-IID-Fed](https://huggingface.co/Karim-Gamal/switch-base-8-finetuned-SemEval-2018-emojis-IID-Fed) | 50.750% | 37.355% |

## Google Colab to test the models on the SemEval test dataset

[The Notebook](https://colab.research.google.com/drive/1CJWfCyT8ofz1xg6W_F5YCMyTpCs36_PP?usp=sharing)

### Framework versions

- Transformers 4.25.1
- PyTorch 1.13.1+cu116
- Tokenizers 0.13.2
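
## Inference example (sketch)

A minimal usage sketch, not taken from the original training code. It assumes the checkpoint loads with the generic seq2seq auto classes and that fine-tuning used a plain text-to-text format in which the model generates an emoji label for an input tweet; check the Colab notebook above for the exact input format.

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

# Hypothetical choice of checkpoint; the other fine-tuned variants listed
# in the table above should load the same way.
model_id = "Karim-Gamal/switch-base-8-finetuned-SemEval-2018-emojis-cen-2"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

tweet = "Sunday afternoon walking through Venice in the sun with @user"
inputs = tokenizer(tweet, return_tensors="pt")

# Generate the predicted label as text and decode it.
outputs = model.generate(**inputs, max_new_tokens=5)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

## Training arguments sketch

A hedged sketch of how the hyperparameters listed above could be expressed with the Hugging Face `Seq2SeqTrainingArguments` API. Only the listed values come from this card; `output_dir` and the per-device interpretation of the batch sizes are assumptions.

```python
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="switch-base-8-finetuned",  # assumed output directory
    learning_rate=1e-4,
    per_device_train_batch_size=464,  # card reports train_batch_size: 464
    per_device_eval_batch_size=512,   # card reports eval_batch_size: 512
    seed=42,
    num_train_epochs=30,
)
```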