# EFTNAS Model Card: eftnas-s2-bert-medium

Super-networks fine-tuned on BERT-medium with the GLUE benchmark using EFTNAS.
## Model Details

### Information

- **Model name:** eftnas-s2-bert-medium-[TASK]
- **Base model:** google/bert_uncased_L-8_H-512_A-8
- **Subnetwork version:** Super-network
- **NNCF Configurations:** eftnas_configs
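
A minimal loading sketch with the Hugging Face `transformers` API, assuming the checkpoints are published as standard BERT sequence-classification models; the repository id `IntelLabs/eftnas-s2-bert-medium-sst2` below is hypothetical, so substitute the actual eftnas-s2-bert-medium-[TASK] checkpoint for your task:

```python
# Minimal loading sketch (assumption: the checkpoint is a standard BERT
# sequence-classification model; the repo id below is hypothetical).
from transformers import AutoModelForSequenceClassification, AutoTokenizer

model_id = "IntelLabs/eftnas-s2-bert-medium-sst2"  # hypothetical, [TASK] = sst2
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)

inputs = tokenizer("A charming and often affecting journey.", return_tensors="pt")
pred = model(**inputs).logits.argmax(dim=-1).item()
print(pred)  # predicted label id from the task head
```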
## Training and Evaluation

### Results

Results of the optimal sub-networks discovered from the super-networks:

| Model | GFLOPs | GLUE Avg. | MNLI-m | QNLI | QQP | SST-2 | CoLA | MRPC | RTE |
|---|---|---|---|---|---|---|---|---|---|
| *Test set* | | | | | | | | | |
| EFTNAS-S1 | 5.7 | 77.7 | 83.7 | 89.9 | 71.8 | 93.4 | 52.6 | 87.6 | 65.0 |
| EFTNAS-S2 | 2.2 | 75.2 | 82.0 | 87.8 | 70.6 | 91.4 | 44.5 | 86.1 | 64.0 |
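
The GLUE Avg. column is consistent with the unweighted mean of the seven task scores, which can be checked directly (per-task numbers copied from the table above):

```python
# Sanity-check the GLUE Avg. column: unweighted mean over the seven tasks
# (MNLI-m, QNLI, QQP, SST-2, CoLA, MRPC, RTE), scores copied from the table.
scores = {
    "EFTNAS-S1": [83.7, 89.9, 71.8, 93.4, 52.6, 87.6, 65.0],
    "EFTNAS-S2": [82.0, 87.8, 70.6, 91.4, 44.5, 86.1, 64.0],
}
for model, vals in scores.items():
    print(model, round(sum(vals) / len(vals), 1))  # -> 77.7 and 75.2
```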
## Model Sources

- **Repository:** https://github.com/IntelLabs/Hardware-Aware-Automated-Machine-Learning/tree/main/EFTNAS
- **Paper:** Searching for Efficient Language Models in First-Order Weight-Reordered Super-Networks
## Citation

```bibtex
@inproceedings{eftnas2024,
  title={Searching for Efficient Language Models in First-Order Weight-Reordered Super-Networks},
  author={J. Pablo Munoz and Yi Zheng and Nilesh Jain},
  booktitle={The 2024 Joint International Conference on Computational Linguistics, Language Resources and Evaluation},
  year={2024}
}
```
## License

Apache-2.0