---
license: apache-2.0
datasets:
- imagenet-1k
metrics:
- accuracy
pipeline_tag: image-classification
tags:
- pytorch
- torch-dag
---

# Model Card for efficientnetv2_b0_pruned_53

This is a pruned version of the [Keras EfficientNetV2B0](https://keras.io/api/applications/efficientnet_v2/#efficientnetv2b0-function) model in the [torch-dag](https://github.com/TCLResearchEurope/torch-dag) format.

This model has roughly 53% of the original model's FLOPs with a minimal drop in metrics.

| Model | KMAPPs* | Parameters (M) | Accuracy (224x224) |
| ----------- | ----------- | ----------- | ------------------ |
| **Keras EfficientNetV2B0 (baseline)** | 29 | 7.1 | 78.69% |
| **efficientnetv2_b0_pruned_53 (ours)** | 15.4 **(53%)** | 5 **(70%)** | 77.69% **(↓ 1.0%)** |

\***KMAPPs**: thousands of FLOPs per input pixel, i.e. `KMAPPs(model) = FLOPs(model) / (H * W * 1000)`, where `(H, W)` is the input resolution.

The accuracy was calculated on the ImageNet-1k validation dataset. For details about image pre-processing, please refer to the original repository.

## Model Details

### Model Description

- **Developed by:** [TCL Research Europe](https://github.com/TCLResearchEurope/)
- **Model type:** Classification / feature backbone
- **License:** Apache 2.0
- **Finetuned from model:** [Keras EfficientNetV2B0](https://keras.io/api/applications/efficientnet_v2/#efficientnetv2b0-function)

### Model Sources

- **Repository:** [Keras EfficientNetV2B0](https://keras.io/api/applications/efficientnet_v2/#efficientnetv2b0-function)

## How to Get Started with the Model

To load the model, you first need to install the [torch-dag](https://github.com/TCLResearchEurope/torch-dag#3-installation) library, which can be done with `pip`:

```
pip install torch-dag
```

Then clone this repository:

```
# Make sure you have git-lfs installed (https://git-lfs.com)
git lfs install
git clone https://huggingface.co/TCLResearchEurope/efficientnetv2_b0_pruned_53
```

Now you are ready to load the model:

```
import torch_dag
import torch

model = torch_dag.io.load_dag_from_path('./efficientnetv2_b0_pruned_53')
model.eval()
out = model(torch.ones(1, 3, 224, 224))
print(out.shape)
```
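
To run the model on a real image rather than a dummy tensor, a minimal sketch is shown below. It assumes the common torchvision ImageNet preprocessing (resize to 256, center-crop to 224, per-channel normalization) and a hypothetical input file `example.jpg`; the exact pre-processing used for the reported accuracy is the one described in the original repository, so treat these transform values as an assumption.

```
# Minimal inference sketch (not part of the original card).
# The Resize/CenterCrop/Normalize values are the standard torchvision ImageNet
# defaults and are an assumption; check the original repository for the exact
# pre-processing used during evaluation.
import torch
import torch_dag
from PIL import Image
from torchvision import transforms

preprocess = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

model = torch_dag.io.load_dag_from_path('./efficientnetv2_b0_pruned_53')
model.eval()

image = Image.open('example.jpg').convert('RGB')  # hypothetical input file
batch = preprocess(image).unsqueeze(0)            # shape: (1, 3, 224, 224)

with torch.no_grad():
    logits = model(batch)
    probs = torch.softmax(logits, dim=1)
    top5 = torch.topk(probs, k=5)
    print(top5.indices, top5.values)              # ImageNet-1k class indices and scores
```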