| Model | Task | Updated | Downloads | Likes |
|---|---|---|---|---|
| timm/vit_large_patch14_clip_224.openai_ft_in12k_in1k | Image Classification | May 6, 2023 | 11.1k | 36 |
| timm/mobilenetv4_conv_aa_large.e230_r448_in12k_ft_in1k | Image Classification | Sep 18 | 3.79k | 2 |
| timm/vit_large_patch14_clip_224.laion2b_ft_in12k_in1k | Image Classification | May 6, 2023 | 13.5k | |
| timm/vit_huge_patch14_clip_224.laion2b_ft_in12k_in1k | Image Classification | May 6, 2023 | 3.62k | 2 |
| timm/vit_huge_patch14_clip_336.laion2b_ft_in12k_in1k | Image Classification | May 6, 2023 | 2.56k | |
| timm/vit_large_patch14_clip_336.laion2b_ft_in12k_in1k | Image Classification | May 6, 2023 | 1.01k | |
| timm/vit_base_patch32_clip_224.laion2b_ft_in12k_in1k | Image Classification | May 6, 2023 | 3.99k | 1 |
| timm/vit_base_patch32_clip_384.laion2b_ft_in12k_in1k | Image Classification | May 6, 2023 | 868 | |
| timm/vit_base_patch32_clip_448.laion2b_ft_in12k_in1k | Image Classification | May 6, 2023 | 20.3k | 1 |
| timm/vit_base_patch32_clip_384.openai_ft_in12k_in1k | Image Classification | May 6, 2023 | 4.82k | |
| timm/vit_base_patch16_clip_384.laion2b_ft_in12k_in1k | Image Classification | May 6, 2023 | 1.52k | 4 |
| timm/vit_large_patch14_clip_336.openai_ft_in12k_in1k | Image Classification | May 6, 2023 | 1.23k | 1 |
| timm/vit_base_patch16_clip_224.laion2b_ft_in12k_in1k | Image Classification | May 6, 2023 | 9.96k | 2 |
| timm/vit_base_patch16_clip_384.openai_ft_in12k_in1k | Image Classification | May 6, 2023 | 497 | 1 |
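
All of the checkpoints above are published as `timm` models on the Hugging Face Hub, so they can be loaded by name with `timm.create_model`. Below is a minimal sketch, assuming `timm` is installed and Hub access is available; the model name and 224×224 input size are taken from the first entry in the list, and the dummy tensor stands in for a real preprocessed image.

```python
# Minimal sketch: load one of the listed checkpoints and run a forward pass.
import timm
import torch

# Any model name from the table above can be substituted here.
model = timm.create_model(
    "vit_large_patch14_clip_224.openai_ft_in12k_in1k",
    pretrained=True,
)
model.eval()

# Build the preprocessing pipeline that matches the checkpoint's training config.
data_config = timm.data.resolve_model_data_config(model)
transform = timm.data.create_transform(**data_config, is_training=False)

# Dummy input; for a real image use transform(PIL.Image.open(path)).unsqueeze(0).
x = torch.randn(1, 3, 224, 224)
with torch.no_grad():
    probs = model(x).softmax(dim=-1)  # ImageNet-1k class probabilities

print(probs.shape)  # torch.Size([1, 1000])
```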