# EFTNAS Model Card: eftnas-s1-bert-base
Super-networks fine-tuned from BERT-base on the GLUE benchmark using EFTNAS.
## Model Details

### Information
- **Model name:** `eftnas-s1-bert-base-[TASK]`
- **Base model:** `bert-base-uncased`
- **Subnetwork version:** Super-network
- **NNCF Configurations:** `eftnas_configs`
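
A minimal loading sketch with Hugging Face `transformers`, for orientation only. The checkpoint ID below is a hypothetical example following the `eftnas-s1-bert-base-[TASK]` naming scheme; substitute the actual task-specific repository.

```python
# Sketch: load a task-specific EFTNAS checkpoint as a standard
# sequence-classification model. The repository ID is an assumed
# example, not a confirmed published checkpoint.
from transformers import AutoModelForSequenceClassification, AutoTokenizer

model_id = "IntelLabs/eftnas-s1-bert-base-mrpc"  # hypothetical ID
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)

# MRPC-style paraphrase pair: two sentences in, class logits out.
inputs = tokenizer(
    "The company reported strong quarterly earnings.",
    "Quarterly earnings at the company were strong.",
    return_tensors="pt",
)
logits = model(**inputs).logits
print(logits.argmax(dim=-1))
```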
## Training and Evaluation

### Results
Results for the optimal sub-network discovered within the super-network:
| Model | GFLOPs | GLUE Avg. | MNLI-m | QNLI | QQP | SST-2 | CoLA | MRPC | RTE |
|---|---|---|---|---|---|---|---|---|---|
| **Development set** | | | | | | | | | |
| EFTNAS-S1 | 5.7 | 82.9 | 84.6 | 90.8 | 91.2 | 93.5 | 60.6 | 90.8 | 69.0 |
| **Test set** | | | | | | | | | |
| EFTNAS-S1 | 5.7 | 77.7 | 83.7 | 89.9 | 71.8 | 93.4 | 52.6 | 87.6 | 65.0 |
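
For reference, the GLUE Avg. column is the unweighted mean of the seven per-task scores; a quick check in Python reproduces it from the rows above:

```python
# Recompute the GLUE Avg. column from the per-task scores in the table.
dev = [84.6, 90.8, 91.2, 93.5, 60.6, 90.8, 69.0]   # MNLI-m .. RTE (dev set)
test = [83.7, 89.9, 71.8, 93.4, 52.6, 87.6, 65.0]  # MNLI-m .. RTE (test set)

print(round(sum(dev) / len(dev), 1))    # 82.9
print(round(sum(test) / len(test), 1))  # 77.7
```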
## Model Sources
- **Repository:** https://github.com/IntelLabs/Hardware-Aware-Automated-Machine-Learning/tree/main/EFTNAS
- **Paper:** Searching for Efficient Language Models in First-Order Weight-Reordered Super-Networks (LREC-COLING 2024)
## Citation
```bibtex
@inproceedings{eftnas2024,
  title     = {Searching for Efficient Language Models in First-Order Weight-Reordered Super-Networks},
  author    = {J. Pablo Munoz and Yi Zheng and Nilesh Jain},
  booktitle = {The 2024 Joint International Conference on Computational Linguistics, Language Resources and Evaluation},
  year      = {2024},
  url       = {}
}
```
## License

Apache-2.0