# Model Card for LoRA-FLAN-T5 large


This repository contains the LoRA (Low-Rank Adaptation) adapter weights of `flan-t5-large`, fine-tuned on the financial_phrasebank dataset.

## Usage

Use this adapter with the `peft` library:

```python
# pip install peft transformers
import torch
from peft import PeftModel, PeftConfig
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

peft_model_id = "ybelkada/flan-t5-large-financial-phrasebank-lora"
config = PeftConfig.from_pretrained(peft_model_id)

# Load the base model and tokenizer referenced by the adapter config
model = AutoModelForSeq2SeqLM.from_pretrained(
    config.base_model_name_or_path,
    torch_dtype='auto',
    device_map='auto'
)
tokenizer = AutoTokenizer.from_pretrained(config.base_model_name_or_path)

# Load the LoRA adapter weights on top of the base model
model = PeftModel.from_pretrained(model, peft_model_id)
```

Enjoy!

