
Finetune sample data and scripts.

pip install transformers
pip install datasets
pip install peft==0.5.0
pip install trl
pip install auto-gptq
pip install optimum

Versions are very important.
For example, if you get an error like

ValueError: Target module QuantLinear() is not supported. Currently, only `torch.nn.Linear` and `Conv1D` are supported.

it is because your peft version is too old; upgrading peft (the peft==0.5.0 pinned above is new enough) resolves it.

I don't know if these exact versions are required, but these are the versions in my running environment:

  • auto-gptq 0.4.1+cu117
  • trl 0.7.1
  • optimum 1.12.1.dev0
  • transformers 4.32.1
  • datasets 2.14.4
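
If you want to confirm what is installed in your own environment, a quick check like the sketch below works (a minimal sketch; the package names simply mirror the install list above):

```python
# Minimal sketch: print the installed versions of the packages used here,
# so they can be compared with the versions listed above.
import importlib.metadata as metadata

for pkg in ["transformers", "datasets", "peft", "trl", "auto-gptq", "optimum"]:
    try:
        print(pkg, metadata.version(pkg))
    except metadata.PackageNotFoundError:
        print(pkg, "is not installed")
```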

The documentation says to install from source, but that sometimes causes errors.
If you can't get it to work, it might be better to wait until a stable release comes out.
Good luck!

  • finetune.py: GPTQ finetune sample script.
  • jawiki3.csv: sample data (Japanese).
  • lora_test.py: after finetuning, you can run the LoRA adapter with this script (a rough sketch of the idea follows below).
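
For reference, here is a minimal sketch of what loading the finetuned LoRA adapter on top of a GPTQ base model can look like. This is not the actual lora_test.py; the model id and adapter path are placeholders you would replace with your own values.

```python
# Minimal sketch, not the actual lora_test.py: load a GPTQ-quantized base model,
# apply a LoRA adapter produced by the finetune step, and generate text.
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base_model_id = "path-or-id-of-your-gptq-model"  # placeholder: GPTQ base model
adapter_path = "./lora_adapter"                  # placeholder: finetune output dir

tokenizer = AutoTokenizer.from_pretrained(base_model_id)
# device_map="auto" needs the accelerate package installed.
model = AutoModelForCausalLM.from_pretrained(base_model_id, device_map="auto")
model = PeftModel.from_pretrained(model, adapter_path)

prompt = "日本の首都は"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```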