How to use InstructBLIP before the PR is merged

#1
by Marcushenriksboe - opened

Hi, I'm really keen on trying out InstructBLIP here before it's merged into Hugging Face Transformers. I tried installing transformers from the branch repo, but I'm getting "cannot import name 'InstructBlipProcessor' from 'transformers'". Wondering if anyone has a quick fix for trying out the branch.

Hi,

Basically you can try it out by doing

pip install --upgrade git+https://github.com/NielsRogge/transformers.git@add_instruct_blip
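If the import error persists after this, it is often because pip installed the branch into a different environment than the one running the code, or an older transformers install is still being picked up; one thing to try (just a suggestion, the exact steps can differ per setup) is a clean reinstall:

pip uninstall -y transformers
pip install --upgrade git+https://github.com/NielsRogge/transformers.git@add_instruct_blip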

and then do:

from transformers import InstructBlipProcessor, InstructBlipForConditionalGeneration
import torch
from PIL import Image
import requests

model = InstructBlipForConditionalGeneration.from_pretrained("nielsr/instructblip-flan-t5-xl")
processor = InstructBlipProcessor.from_pretrained("nielsr/instructblip-flan-t5-xl")

device = "cuda" if torch.cuda.is_available() else "cpu"
model.to(device)

url = "https://raw.githubusercontent.com/salesforce/LAVIS/main/docs/_static/Confusing-Pictures.jpg"
image = Image.open(requests.get(url, stream=True).raw).convert("RGB")
prompt = "What is unusual about this image?"
# move the inputs to the same device as the model
inputs = processor(images=image, text=prompt, return_tensors="pt").to(device)

# beam search decoding; top_p and temperature have no effect since do_sample=False
outputs = model.generate(
        **inputs,
        do_sample=False,
        num_beams=5,
        max_length=256,
        min_length=1,
        top_p=0.9,
        repetition_penalty=1.5,
        length_penalty=1.0,
        temperature=1,
)
generated_text = processor.batch_decode(outputs, skip_special_tokens=True)[0].strip()
print(generated_text)
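If the InstructBlipProcessor import still fails after installing from the branch, a quick way to check which transformers build Python is actually picking up (plain Python, nothing InstructBLIP-specific) is:

import transformers
print(transformers.__version__)
print(transformers.__file__)

The printed path should point at the freshly installed package from the branch; if it points somewhere else (for example a local clone earlier on the path, or a different environment), that copy is the one being imported instead of the branch install.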

Hi,
I ran this code and got the warning "Some weights of the model checkpoint were not used when initializing Blip2ForConditionalGeneration", and it lists some parameters in the QFormer.

Is this normal? Thanks!

Hey,

Even after doing pip install --upgrade git+https://github.com/NielsRogge/transformers.git@add_instruct_blip and running "from transformers import InstructBlipProcessor", I'm still getting "cannot import name 'InstructBlipProcessor' from 'transformers'". How do I fix it?

Since this is merged now, I'll close this one.

nielsr changed discussion status to closed
