hyeongjin99/CLIP-ViT-B-32.laion2b_s34b_b79k-ft
Tags: Zero-Shot Image Classification · OpenCLIP · PyTorch · Safetensors · clip
License: MIT
CLIP-ViT-B-32.laion2b_s34b_b79k-ft
1 contributor · History: 8 commits
Latest commit: 54f90ea (verified, 4 days ago) by hyeongjin99: "Rename open_clip_model.safetensors to model.safetensors"
| File | Size | Last commit | Updated |
|---|---|---|---|
| .gitattributes | 1.52 kB | initial commit | 4 days ago |
| README.md | 155 Bytes | Add model | 4 days ago |
| config.json | 532 Bytes | Rename open_clip_config.json to config.json | 4 days ago |
| merges.txt | 525 kB | Add model | 4 days ago |
| model.safetensors (LFS) | 605 MB | Rename open_clip_model.safetensors to model.safetensors | 4 days ago |
| open_clip_config.json | 532 Bytes | Add model | 4 days ago |
| open_clip_pytorch_model.bin (LFS, pickle) | 605 MB | Add model | 4 days ago |
| pytorch_model.bin (LFS, pickle) | 605 MB | Rename open_clip_pytorch_model.bin to pytorch_model.bin | 4 days ago |
| special_tokens_map.json | 588 Bytes | Add model | 4 days ago |
| tokenizer.json | 3.64 MB | Add model | 4 days ago |
| tokenizer_config.json | 706 Bytes | Add model | 4 days ago |
| vocab.json | 862 kB | Add model | 4 days ago |

Pickle scan: both .bin checkpoints are pickle-based, with three detected pickle imports each: `collections.OrderedDict`, `torch.FloatStorage`, `torch._utils._rebuild_tensor_v2`.