Were special tokens trained?

#71
by Tron2060 - opened

I used LoRA to finetune the model and added special tokens to it, but after finetuning, the model's output looks weird. I suspect this is because the special tokens were never trained, so the model cannot handle their embeddings well.
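A likely explanation, stated here as an assumption: when special tokens are added and the embedding table is resized, the new rows start as random vectors, and a default LoRA setup freezes the base model's weights (embeddings included), so those rows never receive gradient updates. A minimal PyTorch sketch of the effect (sizes are illustrative; with `peft`, the usual remedy is listing the embedding and output layers in `modules_to_save` of the `LoraConfig` so they are trained alongside the adapters):

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

# Stand-in for a pretrained embedding table: vocab of 10 tokens, dim 4.
pretrained = nn.Embedding(10, 4)

# Adding 2 special tokens grows the table; the new rows start as noise.
resized = nn.Embedding(12, 4)
with torch.no_grad():
    resized.weight[:10] = pretrained.weight  # keep the trained rows

# A default LoRA finetune freezes base weights, embeddings included,
# so the random rows for the new special tokens are never updated.
resized.weight.requires_grad_(False)

# The old rows still match the pretrained model...
assert torch.equal(resized.weight[:10], pretrained.weight)
# ...but rows 10-11 (the new special tokens) remain untrained noise,
# which is why outputs containing those tokens look weird.
print(resized.weight.requires_grad)  # False: frozen, never trained
```

If this matches your setup, making the embedding matrix (and output head) trainable during the LoRA finetune should fix the weird outputs.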
