---
license: apache-2.0
datasets:
  - imagenet-1k
metrics:
  - accuracy
pipeline_tag: image-classification
tags:
  - pytorch
  - torch-dag
---

# Model Card for swinv2_cr_tiny_ns_224_pruned_59

This is a pruned version of the [timm/swinv2_cr_tiny_ns_224.sw_in1k](https://huggingface.co/timm/swinv2_cr_tiny_ns_224.sw_in1k) model in the torch-dag format.

This model has roughly 59% of the original model's FLOPs with a minimal accuracy drop.

| Model | KMAPPs* | M Parameters | Accuracy (224x224) |
|---|---|---|---|
| timm/swinv2_cr_tiny_ns_224.sw_in1k (baseline) | 185.8 | 28.3 | 81.53% |
| swinv2_cr_tiny_ns_224_pruned_59 (ours) | 109.4 (59%) | 18.2 (64%) | 81.07% (↓ 0.46%) |

\*KMAPPs: thousands of FLOPs per input pixel, defined as

`KMAPPs(model) = FLOPs(model) / (H * W * 1000)`, where `(H, W)` is the input resolution.
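A quick sketch of the metric as a helper function. The ~5.49 GFLOPs figure below is back-computed from the table for illustration, not an independently measured value:

```python
def kmapps(flops: float, h: int, w: int) -> float:
    """Thousands of FLOPs per input pixel at resolution (h, w)."""
    return flops / (h * w * 1000)

# e.g. a model costing ~5.49 GFLOPs at a 224x224 input:
print(round(kmapps(5.49e9, 224, 224), 1))  # -> 109.4, matching the pruned row above
```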

The accuracy was calculated on the ImageNet-1k validation dataset. For details about image pre-processing, please refer to the original repository.

## Model Details

### Model Description

### Model Sources

## How to Get Started with the Model

To load the model, you first need to install the `torch-dag` library, which can be done with pip:

```shell
pip install torch-dag
```

Then clone this repository:

```shell
# Make sure you have git-lfs installed (https://git-lfs.com)
git lfs install
git clone https://huggingface.co/TCLResearchEurope/swinv2_cr_tiny_ns_224_pruned_59
```

Now you are ready to load the model:

```python
import torch
import torch_dag

model = torch_dag.io.load_dag_from_path('./swinv2_cr_tiny_ns_224_pruned_59')
model.eval()

# Run a forward pass on a dummy input.
out = model(torch.ones(1, 3, 224, 224))
print(out.shape)
```
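As a quick sanity check after loading, you can count the model's parameters and compare against the table above. `count_params_m` is a small helper sketched here, not part of the torch-dag API:

```python
def count_params_m(model) -> float:
    """Total parameter count in millions for any torch.nn.Module-like object."""
    return sum(p.numel() for p in model.parameters()) / 1e6

# After loading the model as shown above:
# print(f'{count_params_m(model):.1f} M parameters')  # ~18.2 M per the table above
```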