|
---
license: apache-2.0
---
|
# Model Card for Jennny/flan-t5-base-summarization-bb

This repository contains a LoRA adapter for `flan-t5-base`, fine-tuned on the [`knkarthick/dialogsum`](https://huggingface.co/datasets/knkarthick/dialogsum) dialogue summarization dataset.
|
|
|
## Usage

Load the adapter on top of the base model with the `peft` library:
|
|
|
```python
# pip install peft transformers

import torch
from peft import PeftModel
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

base_model_id = "google/flan-t5-base"
peft_model_id = "Jennny/flan-t5-base-summarization-bb"

device = "cuda" if torch.cuda.is_available() else "cpu"

# Load the tokenizer and the base model
tokenizer = AutoTokenizer.from_pretrained(base_model_id)
base_model = AutoModelForSeq2SeqLM.from_pretrained(
    base_model_id, torch_dtype=torch.bfloat16
).to(device)

# Attach the LoRA adapter; from_pretrained expects the loaded base
# model, not its id string
model = PeftModel.from_pretrained(
    base_model,
    peft_model_id,
    torch_dtype=torch.bfloat16,
    is_trainable=False,
).to(device)
```
|
|