| Model | Task | Updated | Downloads | Likes |
|---|---|---|---|---|
| timm/eva02_enormous_patch14_plus_clip_224.laion2b_s9b_b144k | Zero-Shot Image Classification | Feb 10 | 14.7k | 6 |
| laion/CLIP-convnext_large_d.laion2B-s26B-b102K-augreg | Zero-Shot Image Classification | Apr 18, 2023 | 13k | 4 |
| laion/CLIP-convnext_base_w-laion2B-s13B-b82K | Zero-Shot Image Classification | Apr 18, 2023 | 12.6k | 3 |
| laion/CLIP-convnext_base_w_320-laion_aesthetic-s13B-b82K | Zero-Shot Image Classification | Apr 18, 2023 | 7.96k | 1 |
| laion/CLIP-convnext_base_w-laion_aesthetic-s13B-b82K | Zero-Shot Image Classification | Apr 18, 2023 | 5.51k | 4 |
| laion/CLIP-ViT-B-16-DataComp.L-s1B-b8K | Zero-Shot Image Classification | Apr 26, 2023 | 5.38k | 1 |
| laion/CLIP-ViT-B-16-DataComp.XL-s13B-b90K | Zero-Shot Image Classification | Sep 29, 2023 | 5.19k | 5 |
| OFA-Sys/chinese-clip-vit-large-patch14 | Zero-Shot Image Classification | Dec 9, 2022 | 4.96k | 21 |
| timm/eva02_large_patch14_clip_224.merged2b_s4b_b131k | Zero-Shot Image Classification | Feb 10 | 4.76k | 5 |
| laion/CLIP-ViT-B-32-DataComp.M-s128M-b4K | Zero-Shot Image Classification | Apr 26, 2023 | 3.94k | |
| wkcn/TinyCLIP-ViT-8M-16-Text-3M-YFCC15M | Zero-Shot Image Classification | Dec 19, 2023 | 3.17k | 4 |
| timm/eva02_large_patch14_clip_336.merged2b_s6b_b61k | Zero-Shot Image Classification | Feb 10 | 2.81k | |
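Every checkpoint listed above is tagged for zero-shot image classification, so any of them can be tried through the Hugging Face `transformers` pipeline. A minimal sketch, using one model ID from the list; the blank test image and the candidate labels are placeholder assumptions, not part of the listing:

```python
from PIL import Image
from transformers import pipeline

# Any model ID from the list works here; this one is LAION's
# ViT-B/16 trained on DataComp.XL.
classifier = pipeline(
    "zero-shot-image-classification",
    model="laion/CLIP-ViT-B-16-DataComp.XL-s13B-b90K",
)

# Placeholder image for illustration; in practice use Image.open(path).
image = Image.new("RGB", (224, 224), "white")

# Candidate labels are free-form text prompts scored against the image.
results = classifier(
    image,
    candidate_labels=["a photo of a cat", "a photo of a dog"],
)
for r in results:
    print(f"{r['label']}: {r['score']:.3f}")
```

The pipeline returns one `{"label": ..., "score": ...}` dict per candidate label, sorted by score, with the scores softmax-normalized over the label set.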