Could you please provide a script for inference?
#3
by zhangtaokd - opened
I tried to load the model using the following code, but it got stuck.
```python
import requests
from PIL import Image
import torch
from transformers import AutoProcessor, LlavaForConditionalGeneration

model_id = "./Yi-VL-6B"

# Loading hangs at this step.
model = LlavaForConditionalGeneration.from_pretrained(
    model_id,
    torch_dtype=torch.float16,
    low_cpu_mem_usage=True,
).to(0)
```
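For reference, here is a minimal sketch of how inference typically proceeds once a checkpoint loads with `LlavaForConditionalGeneration`. It assumes the weights are in the Hugging Face LLaVA format and uses a generic LLaVA-style prompt and a placeholder image URL; both may need adjusting for Yi-VL-6B.

```python
# Minimal inference sketch, assuming the checkpoint is in the HF LLaVA format.
# The prompt template and image URL are placeholders, not Yi-VL's official ones.
import requests
from PIL import Image
import torch
from transformers import AutoProcessor, LlavaForConditionalGeneration

model_id = "./Yi-VL-6B"
model = LlavaForConditionalGeneration.from_pretrained(
    model_id,
    torch_dtype=torch.float16,
    low_cpu_mem_usage=True,
).to(0)
processor = AutoProcessor.from_pretrained(model_id)

# Example image; replace with your own.
url = "http://images.cocodataset.org/val2017/000000039769.jpg"
image = Image.open(requests.get(url, stream=True).raw)

# Generic LLaVA-style prompt; Yi-VL may expect a different chat template.
prompt = "USER: <image>\nWhat is shown in this image? ASSISTANT:"

inputs = processor(text=prompt, images=image, return_tensors="pt").to(0, torch.float16)
output = model.generate(**inputs, max_new_tokens=100, do_sample=False)
print(processor.decode(output[0], skip_special_tokens=True))
```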
As your question seems to have been answered, I will close this discussion for now if there is nothing else we can help you with on this matter.
If you have any further questions, feel free to reopen this discussion or start a new one.
Thank you for your contribution to this community!
richardllin changed discussion status to closed