Cannot use the example to make it work with OpenCLIP
#1 · opened by matbreotten
The following example code did not work for me on Python 3.9 with the latest version installed:
pip install open-clip-torch

import open_clip
model, preprocess_train, preprocess_val = open_clip.create_model_and_transforms('hf-hub:patrickjohncyh/fashion-clip')
tokenizer = open_clip.get_tokenizer('hf-hub:patrickjohncyh/fashion-clip')
Is the example meant to be used with the CLIP model from OpenAI? Could you update the example code so that the model can be run?
This should work with the HF API. If you can point us to how to adapt the model to the open_clip APIs, I'm happy to work on this.
Hi @vinid, thank you for the Google Colab. I got it working now with the transformers API from Hugging Face, using the CLIPProcessor and CLIPModel classes.
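For anyone landing here later, here is a minimal sketch of that approach. It assumes the checkpoint loads with the standard transformers CLIP classes; the image path ("shirt.jpg") and the candidate labels are placeholders for illustration.

from PIL import Image
from transformers import CLIPModel, CLIPProcessor

# Load fashion-clip through the standard transformers CLIP classes
model = CLIPModel.from_pretrained("patrickjohncyh/fashion-clip")
processor = CLIPProcessor.from_pretrained("patrickjohncyh/fashion-clip")

# Placeholder inputs: swap in your own image and label candidates
image = Image.open("shirt.jpg")
labels = ["a t-shirt", "a dress", "a pair of shoes"]

inputs = processor(text=labels, images=image, return_tensors="pt", padding=True)
outputs = model(**inputs)

# Softmax over the image-text similarity scores gives per-label probabilities
probs = outputs.logits_per_image.softmax(dim=1)
print(dict(zip(labels, probs[0].tolist())))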
matbreotten changed discussion status to closed