How to fine-tune?

#12 by NickyNicky

Same question as the title above.

Same question; I want to explore an application of these 7B models that I have been thinking about for quite some time.

Looking for the same.

Still experimenting with Colab fine-tuning...

Looking for the same here as well. So far it's amazing...

Seems to be mostly working with the default qlora.py script with one small change.

This line was causing trouble because Mistral's model.config.pad_token_id returns None:

"unk_token": tokenizer.convert_ids_to_tokens(
     model.config.pad_token_id if model.config.pad_token_id != -1 else tokenizer.pad_token_id
),

Adding a None check seems to fix it:

"unk_token": tokenizer.convert_ids_to_tokens(
    model.config.pad_token_id if 
    model.config.pad_token_id is not None and model.config.pad_token_id != -1 
    else tokenizer.pad_token_id
),
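
Relatedly, a minimal sketch (not from qlora.py, just an idea) for sidestepping the issue up front is to give the tokenizer an explicit pad token, since Mistral ships without one; reusing EOS as the pad token is an assumption that works for typical causal-LM fine-tunes:

from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("mistralai/Mistral-7B-v0.1")
if tokenizer.pad_token is None:
    # Assumption: reusing EOS as the pad token is acceptable for this fine-tune
    tokenizer.pad_token = tokenizer.eos_token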

Thanks. I am not even sure yet if it needs fine-tuning; we are running a bunch of tests on it.

It seems to be the best 7B model we have ever seen. It may outperform 13B and possibly 70B models for certain use cases.

It's very fast.

Usage:
- With Flash Attention 2 (see the loading sketch after this list), generating approximately 600 tokens takes under 20 seconds, if I remember correctly.
- Without Flash Attention 2, generating 100 tokens takes over 440 seconds.
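
For reference, a minimal sketch of loading the model with Flash Attention 2; this assumes the flash-attn package is installed and a recent transformers version (older releases used the use_flash_attention_2=True flag instead of attn_implementation):

import torch
from transformers import AutoModelForCausalLM

model = AutoModelForCausalLM.from_pretrained(
    "mistralai/Mistral-7B-v0.1",
    torch_dtype=torch.bfloat16,               # half precision so it fits on one GPU
    attn_implementation="flash_attention_2",  # requires the flash-attn package
    device_map="auto",
)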

Fine-tuning (SFTTrainer; see the training sketch after this list):
- GPU: A100 on Colab (40 GB)
- A 15k-example dataset takes approximately 1 hour
- About 14 Colab compute credits per hour
- per_device_train_batch_size=6 uses about 36 GB of GPU memory
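
For reference, a minimal sketch of a QLoRA + SFTTrainer run along these lines; the dataset, LoRA hyperparameters, and sequence length are illustrative, and the SFTTrainer signature has changed across trl versions (this follows the 2023-era API):

import torch
from datasets import load_dataset
from transformers import (AutoModelForCausalLM, AutoTokenizer,
                          BitsAndBytesConfig, TrainingArguments)
from peft import LoraConfig
from trl import SFTTrainer

model_id = "mistralai/Mistral-7B-v0.1"

# Load the base model in 4-bit (QLoRA-style quantization)
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.bfloat16,
)
model = AutoModelForCausalLM.from_pretrained(
    model_id, quantization_config=bnb_config, device_map="auto"
)

tokenizer = AutoTokenizer.from_pretrained(model_id)
tokenizer.pad_token = tokenizer.eos_token  # Mistral has no pad token by default

# Illustrative LoRA settings
peft_config = LoraConfig(r=16, lora_alpha=32, lora_dropout=0.05, task_type="CAUSAL_LM")

trainer = SFTTrainer(
    model=model,
    train_dataset=load_dataset("timdettmers/openassistant-guanaco", split="train"),  # illustrative dataset
    dataset_text_field="text",
    max_seq_length=1024,
    tokenizer=tokenizer,
    peft_config=peft_config,
    args=TrainingArguments(
        output_dir="mistral-7b-sft",
        per_device_train_batch_size=6,  # ~36 GB on an A100, per the numbers above
        num_train_epochs=1,
        bf16=True,
        logging_steps=10,
    ),
)
trainer.train()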

Has anyone made a Colab notebook for fine-tuning?

Seems to be mostly working with the default qlora.py script with one small change.

Using this script with your change seems to work, but I had to pull in the latest transformers (4.34) to get it to work.

I also had to reinstall transformers with:

pip install git+https://github.com/huggingface/transformers

Step-by-step guide to fine-tune on a QA dataset: https://medium.com/me/stats/post/0f7bebccf11c

haha error 404
