How to load the som-llava model using the transformers library?

#1 by dyliu

I attempted to load the model with the following code, but it fails:

from transformers import AutoProcessor, LlavaForConditionalGeneration

# Loading the SoM-LLaVA checkpoint as an HF-format LLaVA model; this call raises the error
model = LlavaForConditionalGeneration.from_pretrained("zzxslp/som-llava-v1.5-13b").to('cuda').eval()
processor = AutoProcessor.from_pretrained("zzxslp/som-llava-v1.5-13b")

Is it possible to load the som-llava model directly with the transformers library, or is the checkpoint not compatible with this approach?

zzxslp (Owner) changed discussion status to closed
