| Model | URL |
| --- | --- |
| clip-vit-large-patch14 | https://huggingface.co/openai/clip-vit-large-patch14 |
| blip2-opt-2.7b | https://huggingface.co/Salesforce/blip2-opt-2.7b |
| siglip-base-patch16-224 | https://huggingface.co/google/siglip-base-patch16-224 |
| open_clip-ViT-L/14 | https://github.com/mlfoundations/open_clip |
| e5-v | https://huggingface.co/royokong/e5-v |
| MagicLens | https://github.com/google-deepmind/magiclens |
| MMRet | https://huggingface.co/JUNJIE99/MMRet-large |
| VLM2Vec-Phi-3.5-v | https://huggingface.co/TIGER-Lab/VLM2Vec-Full |
| VLM2Vec | https://github.com/TIGER-AI-Lab/VLM2Vec |
| VLM2Vec (Qwen2-VL-7B-LoRA-HighRes) | https://huggingface.co/TIGER-Lab/VLM2Vec-Qwen2VL-7B |
| VLM2Vec (Qwen2-VL-2B-LoRA-HighRes) | https://huggingface.co/TIGER-Lab/VLM2Vec-Qwen2VL-2B |
| UniIR | https://huggingface.co/TIGER-Lab/UniIR |
| OpenCLIP-FT | https://doi.org/10.48550/arXiv.2212.07143 |
| CLIP-FT | https://doi.org/10.48550/arXiv.2103.00020 |
| mmE5 | https://huggingface.co/intfloat/mmE5-mllama-11b-instruct |
| gme-Qwen2-VL-2B-Instruct | https://huggingface.co/Alibaba-NLP/gme-Qwen2-VL-2B-Instruct |
| MM-Embed | https://huggingface.co/nvidia/MM-Embed |
| LLaVE-7B | https://huggingface.co/zhibinlan/LLaVE-7B |
| LLaVE-2B | https://huggingface.co/zhibinlan/LLaVE-2B |
| LLaVE-0.5B | https://huggingface.co/zhibinlan/LLaVE-0.5B |
| UniME (LLaVA-OneVision-7B-LoRA-Res336) | https://huggingface.co/DeepGlint-AI/UniME-LLaVA-OneVision-7B |
| UniME (LLaVA-1.6-7B-LoRA-LowRes) | https://huggingface.co/DeepGlint-AI/UniME-LLaVA-1.6-7B |
| UniME (Phi-3.5-V-LoRA) | https://huggingface.co/DeepGlint-AI/UniME-Phi3.5-V-4.2B |
| QQMM-embed | https://github.com/QQ-MM/QQMM-embed |
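
The checkpoints above expose different interfaces, but the Hugging Face hosted dual encoders can all be queried for embeddings in roughly the same way. As a minimal sketch (not the reference implementation for any specific model in the table), the snippet below loads `openai/clip-vit-large-patch14` with the `transformers` library and scores an image against a caption; the image path is a placeholder, and output fields differ for other model families.

```python
# Minimal sketch: image-text embedding with one listed checkpoint
# (openai/clip-vit-large-patch14) via the transformers library.
import torch
from PIL import Image
from transformers import CLIPModel, CLIPProcessor

MODEL_ID = "openai/clip-vit-large-patch14"

model = CLIPModel.from_pretrained(MODEL_ID)
processor = CLIPProcessor.from_pretrained(MODEL_ID)
model.eval()

image = Image.open("example.jpg")  # placeholder path: any RGB image
inputs = processor(
    text=["a photo of a cat"],
    images=image,
    return_tensors="pt",
    padding=True,
)

with torch.no_grad():
    # Project both modalities into the shared embedding space.
    image_embeds = model.get_image_features(pixel_values=inputs["pixel_values"])
    text_embeds = model.get_text_features(
        input_ids=inputs["input_ids"],
        attention_mask=inputs["attention_mask"],
    )

# L2-normalize so the dot product is a cosine similarity.
image_embeds = image_embeds / image_embeds.norm(dim=-1, keepdim=True)
text_embeds = text_embeds / text_embeds.norm(dim=-1, keepdim=True)
similarity = (image_embeds @ text_embeds.T).item()
print(f"cosine similarity: {similarity:.4f}")
```

For retrieval, the same normalized embeddings can be indexed and ranked by dot product; the LoRA-tuned and instruction-tuned entries (VLM2Vec, UniME, gme, mmE5) generally wrap a prompt template around the query first, so consult each model card for its expected input format.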