
Load the tokenizer, model, and data collator

```python
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer, DataCollatorWithPadding

model_name = "google/flan-t5-large"

model = AutoModelForSeq2SeqLM.from_pretrained(model_name)
tokenizer = AutoTokenizer.from_pretrained(model_name)

# Pads each batch dynamically to the length of its longest input_str
data_collator = DataCollatorWithPadding(tokenizer=tokenizer)
```
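
The card does not show the preprocessing step that produces the sequences the collator pads; below is a minimal sketch, assuming a Hugging Face `Dataset` with an `input_str` column and the `max_length` value defined later in this card.

```python
# Hypothetical preprocessing function; the column name "input_str" and
# max_length=256 come from elsewhere in this card, the rest is assumed.
def preprocess(example):
    # Truncate here; padding is deferred to the data collator at batch time.
    return tokenizer(example["input_str"], max_length=256, truncation=True)

# tokenized_train = train_dataset.map(preprocess)  # train_dataset is an assumed variable name
```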

LoRA configuration

```python
from peft import LoraConfig, TaskType

lora_config = LoraConfig(
    task_type=TaskType.SEQ_2_SEQ_LM,
    r=16,                         # rank of the low-rank update matrices
    lora_alpha=32,                # scaling factor applied to the LoRA updates
    bias='none',
    target_modules=['q', 'v'],    # T5 attention query and value projections receive LoRA adapters
    lora_dropout=0.01,
    modules_to_save=['lm_head'],  # keep the LM head trainable and save it with the adapter
)
```

Apply LoRA to the base model

```python
from peft import get_peft_model

model = get_peft_model(model, lora_config)
```
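
To confirm that only the LoRA adapters and the saved `lm_head` will be updated, the PEFT model can report its trainable parameter count (a quick sanity check, not shown in the original card):

```python
# Prints the number of trainable parameters versus the total parameter count.
model.print_trainable_parameters()
```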

Initial training configuration

```python
num_train = 10500   # number of training examples
num_valid = 1300    # number of validation examples
num_epochs = 10
max_length = 256    # maximum tokenized sequence length
batch_size = 2
```
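
The card stops before the training loop; the sketch below shows one plausible way to wire these values into a `Seq2SeqTrainer`. The `output_dir`, `learning_rate`, and the `tokenized_train` / `tokenized_valid` variables are assumptions, not part of the original card.

```python
from transformers import Seq2SeqTrainer, Seq2SeqTrainingArguments

# Hypothetical training setup; the hyperparameters defined above are reused,
# everything else (output_dir, learning_rate, dataset variables) is assumed.
training_args = Seq2SeqTrainingArguments(
    output_dir="absa_10_domains_large_lora",
    num_train_epochs=num_epochs,
    per_device_train_batch_size=batch_size,
    per_device_eval_batch_size=batch_size,
    learning_rate=1e-4,
    evaluation_strategy="epoch",
    save_strategy="epoch",
)

trainer = Seq2SeqTrainer(
    model=model,
    args=training_args,
    train_dataset=tokenized_train,   # assumed tokenized training split (num_train examples)
    eval_dataset=tokenized_valid,    # assumed tokenized validation split (num_valid examples)
    data_collator=data_collator,
    tokenizer=tokenizer,
)
trainer.train()
```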


