```py
import torch
from transformers import CLIPTextModel, CLIPTokenizer

checkpoint = "openai/clip-vit-large-patch14"

# Load the CLIP text encoder in bfloat16 along with its tokenizer
model = CLIPTextModel.from_pretrained(
    checkpoint, torch_dtype=torch.bfloat16
)
tokenizer = CLIPTokenizer.from_pretrained(checkpoint)

# Push both to the Hub (requires being logged in with write access)
model.push_to_hub("ariG23498/clip-vit-large-patch14-torch")
tokenizer.push_to_hub("ariG23498/clip-vit-large-patch14-torch")
```