Fine-tuning

#1 opened by almugabo

Hi,
I have a short question.
How did you fine-tune it? Could you please share the scripts, or a link where I can find more information?
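(The original scripts weren't posted in this thread, but a typical QLoRA setup for phi-1.5 with the transformers + peft + bitsandbytes stack looks roughly like the sketch below. The target module names and every hyperparameter here are illustrative assumptions, not the author's settings.)

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig
from peft import LoraConfig, get_peft_model, prepare_model_for_kbit_training

model_id = "microsoft/phi-1_5"

# 4-bit NF4 quantization: the "Q" in QLoRA.
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.bfloat16,
)

model = AutoModelForCausalLM.from_pretrained(
    model_id,
    quantization_config=bnb_config,
    trust_remote_code=True,  # phi-1.5 shipped with custom modeling code
)
model = prepare_model_for_kbit_training(model)

# LoRA adapters on the attention projections (the layer names are an
# assumption based on the phi-1.5 custom code, not taken from the notebook).
lora_config = LoraConfig(
    r=16,
    lora_alpha=32,
    lora_dropout=0.05,
    target_modules=["Wqkv", "out_proj"],
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, lora_config)
model.print_trainable_parameters()

tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
tokenizer.pad_token = tokenizer.eos_token  # phi-1.5 has no pad token

# From here, tokenize a dataset and train with transformers.Trainer
# (or trl's SFTTrainer) as usual.
```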

Hi @ahmed000000000, just curious:
my QLoRA fine-tuning of phi-1.5 shows this message about attention_mask during fine-tuning:

`attention_mask` is not supported during training. Using it might lead to unexpected results.

Why doesn't this kind of message appear in your ipynb?
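(Not from the original notebook, but a common workaround when that warning appears is to simply not pass attention_mask to the model, e.g. by dropping it in the data collator. The wrapper below is a hypothetical helper along those lines.)

```python
from transformers import DataCollatorForLanguageModeling

class NoAttentionMaskCollator(DataCollatorForLanguageModeling):
    """Hypothetical helper: drops attention_mask from each batch, since
    the phi-1.5 training code ignores it anyway (hence the warning)."""
    def __call__(self, features, return_tensors=None):
        batch = super().__call__(features, return_tensors)
        batch.pop("attention_mask", None)
        return batch

# Usage for causal LM training, assuming a `tokenizer` is already set up:
# collator = NoAttentionMaskCollator(tokenizer=tokenizer, mlm=False)
```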

Secondly, if attention_mask is not supported, then when we set

tokenizer.pad_token = tokenizer.eos_token

do we need to change padding_side?
I mean, if padding_side = 'left' as some other tutorials suggest,
the input_ids will become EOS EOS EOS EOS EOS Below is the instruction, bla bla bla...
and the model will be trained on predicting a lot of EOS tokens at the beginning of the sequence. Wouldn't that be somewhat weird?
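(For what it's worth, here is a quick illustration of the two padding sides; a sketch assuming the stock phi-1.5 tokenizer, with an arbitrary max_length.)

```python
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("microsoft/phi-1_5")
tokenizer.pad_token = tokenizer.eos_token

prompt = "Below is the instruction"

# Left padding: EOS tokens are prepended, exactly as described above.
tokenizer.padding_side = "left"
print(tokenizer(prompt, padding="max_length", max_length=10)["input_ids"])
# -> [eos_id, eos_id, ..., <prompt token ids>]

# Right padding (the usual choice for training): EOS tokens are appended.
tokenizer.padding_side = "right"
print(tokenizer(prompt, padding="max_length", max_length=10)["input_ids"])
# -> [<prompt token ids>, ..., eos_id, eos_id]
```

Either way, the model is only trained on the padding if those positions keep their labels. The usual pattern is to set labels to -100 at padded positions so the loss ignores them (DataCollatorForLanguageModeling does this when a pad token is set, though with pad_token == eos_token it also masks the genuine end-of-sequence EOS). Left padding matters mainly for batched generation, not for training.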
