rwightman (HF staff) committed

Commit b53be5f (1 parent: 4b8f42b)

Update model config and README

Files changed (1):
  1. README.md (+5, -3)
README.md CHANGED
@@ -37,8 +37,8 @@ Recipe details:
 - **Model Type:** Image classification / feature backbone
 - **Model Stats:**
   - Params (M): 60.4
-  - GMACs: 16.5
-  - Activations (M): 28.2
+  - GMACs: 15.5
+  - Activations (M): 18.1
   - Image size: 256 x 256
 - **Papers:**
   - Vision Transformers Need Registers: https://arxiv.org/abs/2309.16588
@@ -139,7 +139,9 @@ output = model.forward_head(output, pre_logits=True)
 | model | top1 | top5 | param_count | img_size |
 | -------------------------------------------------- | ------ | ------ | ----------- | -------- |
 | [vit_mediumd_patch16_reg4_gap_256.sbb_in12k_ft_in1k](https://huggingface.co/timm/vit_mediumd_patch16_reg4_gap_256.sbb_in12k_ft_in1k) | 86.202 | 97.874 | 64.11 | 256 |
-| [vit_betwixt_patch16_reg4_gap_256.sbb_in12k_ft_in1k](https://huggingface.co/timm/vit_betwixt_patch16_reg4_gap_256.sbb_in12k_ft_in1k) | 85.418 | 97.48 | 60.4 | 256 |
+| [vit_betwixt_patch16_reg4_gap_256.sbb_in12k_ft_in1k](https://huggingface.co/timm/vit_betwixt_patch16_reg4_gap_256.sbb_in12k_ft_in1k) | 85.418 | 97.480 | 60.4 | 256 |
+| [vit_medium_patch16_reg4_gap_256.sbb_in12k_ft_in1k](https://huggingface.co/timm/vit_medium_patch16_reg4_gap_256.sbb_in12k_ft_in1k) | 84.930 | 97.386 | 38.88 | 256 |
+| [vit_little_patch16_reg1_gap_256.sbb_in12k_ft_in1k](https://huggingface.co/timm/vit_little_patch16_reg1_gap_256.sbb_in12k_ft_in1k) | 83.774 | 96.972 | 22.52 | 256 |
 | [vit_mediumd_patch16_rope_reg1_gap_256.sbb_in1k](https://huggingface.co/timm/vit_mediumd_patch16_rope_reg1_gap_256.sbb_in1k) | 84.322 | 96.812 | 63.95 | 256 |
 | [vit_betwixt_patch16_rope_reg4_gap_256.sbb_in1k](https://huggingface.co/timm/vit_betwixt_patch16_rope_reg4_gap_256.sbb_in1k) | 83.906 | 96.684 | 60.23 | 256 |
 | [vit_base_patch16_rope_reg1_gap_256.sbb_in1k](https://huggingface.co/timm/vit_base_patch16_rope_reg1_gap_256.sbb_in1k) | 83.866 | 96.67 | 86.43 | 256 |
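For context on the stats this commit corrects, below is a minimal sketch (not part of the card itself) of loading the updated model with timm and sanity-checking the parameter count, the 256 x 256 input size, and the pre-logits embedding path that the second hunk's usage section refers to. It assumes a recent timm release where `vit_betwixt_patch16_reg4_gap_256.sbb_in12k_ft_in1k` is available; the GMAC and activation figures in the README come from timm's own benchmark tooling and are not recomputed here.

```python
# Sketch: verify the stats this commit touches for the updated model card.
import torch
import timm

model = timm.create_model(
    'vit_betwixt_patch16_reg4_gap_256.sbb_in12k_ft_in1k',
    pretrained=True,
).eval()

# Parameter count should line up with the "Params (M): 60.4" stat.
n_params = sum(p.numel() for p in model.parameters())
print(f'params (M): {n_params / 1e6:.1f}')

# Resolve the eval data config (expects input_size of (3, 256, 256)) and run
# the embedding path referenced in the hunk header:
# forward_features -> forward_head(..., pre_logits=True).
cfg = timm.data.resolve_model_data_config(model)
x = torch.randn(1, *cfg['input_size'])
with torch.no_grad():
    feats = model.forward_features(x)
    emb = model.forward_head(feats, pre_logits=True)
print(cfg['input_size'], emb.shape)
```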