Base model:

  • google/paligemma-3b-pt-224

Dataset:

  • HuggingFaceM4/VQAv2

Getting started:

from peft import PeftModel, PeftConfig
from transformers import PaliGemmaForConditionalGeneration

# Load the adapter config to see which base model the LoRA weights were trained on
config = PeftConfig.from_pretrained("ayoubkirouane/PaliGemma-VQAv2-Lora-finetuned")

# PaliGemma is a vision-language model, so it is loaded with
# PaliGemmaForConditionalGeneration rather than AutoModelForCausalLM
base_model = PaliGemmaForConditionalGeneration.from_pretrained("google/paligemma-3b-pt-224")

# Attach the LoRA adapter weights on top of the base model
model = PeftModel.from_pretrained(base_model, "ayoubkirouane/PaliGemma-VQAv2-Lora-finetuned")
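
A minimal inference sketch follows. It assumes the standard PaliGemma processor from the base checkpoint; the image file name, question, and the "answer en" task prefix are placeholders for illustration, not part of this repository, and the exact prompt format depends on how the adapter was trained.

from PIL import Image
import torch
from transformers import AutoProcessor

# The processor (tokenizer + image preprocessing) comes from the base checkpoint
processor = AutoProcessor.from_pretrained("google/paligemma-3b-pt-224")

# Placeholder image and question -- replace with your own VQA input
image = Image.open("example.jpg")
prompt = "answer en What is in the image?"

inputs = processor(text=prompt, images=image, return_tensors="pt")
with torch.no_grad():
    output_ids = model.generate(**inputs, max_new_tokens=20)

# The generated sequence includes the prompt tokens, so decode only the new tokens
answer = processor.decode(output_ids[0][inputs["input_ids"].shape[-1]:], skip_special_tokens=True)
print(answer)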