| Model | Task | Updated | Downloads | Likes |
|---|---|---|---|---|
| timm/vit_huge_patch14_clip_336.laion2b_ft_in12k_in1k | Image Classification | May 6, 2023 | 2.64k | 2 |
| timm/vit_large_patch14_clip_224.openai_ft_in12k_in1k | Image Classification | May 6, 2023 | 873 | 38 |
| timm/vit_base_patch32_clip_224.laion2b_ft_in12k_in1k | Image Classification | May 6, 2023 | 3.65k | 2 |
| timm/vit_base_patch32_clip_448.laion2b_ft_in12k_in1k | Image Classification | May 6, 2023 | 14.9k | 2 |
| timm/vit_medium_patch16_reg4_gap_256.sbb_in12k_ft_in1k | Image Classification | May 27, 2024 | 822 | 2 |
| timm/mobilenetv4_hybrid_medium.e200_r256_in12k_ft_in1k | Image Classification | Sep 2, 2024 | 1.43k | 1 |
| timm/vit_large_patch14_clip_224.laion2b_ft_in12k_in1k | Image Classification | May 6, 2023 | 4.23k | |
| timm/vit_huge_patch14_clip_224.laion2b_ft_in12k_in1k | Image Classification | May 6, 2023 | 453 | 2 |
| timm/vit_large_patch14_clip_336.laion2b_ft_in12k_in1k | Image Classification | May 6, 2023 | 1.88k | |
| timm/vit_base_patch32_clip_384.laion2b_ft_in12k_in1k | Image Classification | May 6, 2023 | 463 | |
| timm/vit_base_patch32_clip_384.openai_ft_in12k_in1k | Image Classification | May 6, 2023 | 2.4k | |
| timm/vit_base_patch16_clip_384.laion2b_ft_in12k_in1k | Image Classification | May 6, 2023 | 1.63k | 4 |
| timm/vit_large_patch14_clip_336.openai_ft_in12k_in1k | Image Classification | May 6, 2023 | 329 | 1 |
| timm/vit_base_patch16_clip_224.laion2b_ft_in12k_in1k | Image Classification | May 6, 2023 | 1.61k | 2 |
| timm/vit_base_patch16_clip_384.openai_ft_in12k_in1k | Image Classification | May 6, 2023 | 161 | 1 |
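
Any of the checkpoints above can be loaded through the `timm` library. The snippet below is a minimal sketch, assuming a recent `timm` release (0.9 or newer) and a local image file; `example.jpg` is a placeholder path, and the model name shown is just one entry from the table that can be swapped for any of the others.

```python
from PIL import Image
import timm
import torch

# Load one of the listed checkpoints; pretrained weights are pulled from the Hugging Face Hub.
# Use the model name without the "timm/" organization prefix.
model = timm.create_model(
    "vit_base_patch16_clip_384.laion2b_ft_in12k_in1k",
    pretrained=True,
)
model.eval()

# Build the eval-time preprocessing that matches this checkpoint's training config
# (input size, interpolation, normalization).
data_config = timm.data.resolve_model_data_config(model)
transform = timm.data.create_transform(**data_config, is_training=False)

img = Image.open("example.jpg").convert("RGB")  # placeholder image path
with torch.no_grad():
    logits = model(transform(img).unsqueeze(0))  # add batch dimension

# Top-5 ImageNet-1k class indices and probabilities.
top5 = logits.softmax(dim=-1).topk(5)
print(top5.indices, top5.values)
```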