In the original paper, OPT-2.7B and Flan-T5-XXL were used as the large language models. I would like to train with different language models such as BERT or GPT-2 instead. Is that possible with the `Blip2ForConditionalGeneration` class in the transformers library?
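
For reference, here is a minimal sketch of what I have in mind, assuming `Blip2Config` accepts an arbitrary decoder-only `text_config` (e.g. GPT-2) and that the model would then be trained from scratch, since the released BLIP-2 checkpoints were only aligned with OPT and Flan-T5:

```python
from transformers import (
    Blip2Config,
    Blip2VisionConfig,
    Blip2QFormerConfig,
    Blip2ForConditionalGeneration,
    GPT2Config,
)

# Default ViT and Q-Former configurations from the BLIP-2 implementation.
vision_config = Blip2VisionConfig()
qformer_config = Blip2QFormerConfig()

# Swap in a decoder-only language model config (GPT-2 here).
# An encoder-only model like BERT is not a generative LM, so it is unclear
# whether it would fit this conditional-generation setup at all.
text_config = GPT2Config()

# Blip2Config expects plain dicts for the sub-configs.
config = Blip2Config(
    vision_config=vision_config.to_dict(),
    qformer_config=qformer_config.to_dict(),
    text_config=text_config.to_dict(),
)

# Randomly initialized model; no pretrained BLIP-2 weights are loaded here.
model = Blip2ForConditionalGeneration(config)
print(type(model.language_model))  # expecting a GPT-2 LM head model
```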