34 hours for fine-tuning?

#7
by dad1909 - opened

How can I best optimize this? Fine-tuning takes 34 hours for only 1000 steps.

Unsloth AI org

Sorry, I'm confused by your question. Can you specify what you mean?

Using max_steps = 200 requires more than 2 hours for fine-tuning.
I think your Unsloth library is not working as expected.
Moreover, your library can't be used on ZeroGPU.

@dad1909 the time taken for fine-tuning depends on the dataset being used. The larger and more complex the dataset, the longer it takes.
Unsloth works as expected. To get a better understanding, you can run the native Hugging Face Transformers fine-tuning example script on the same dataset to get an idea of the speedup Unsloth provides.
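For reference, here is a minimal sketch of the kind of Unsloth QLoRA setup being timed here, following the pattern in the public Unsloth Colab notebooks. The model name, dataset, and hyperparameters below are placeholders, not @dad1909's actual config, and the exact SFTTrainer arguments vary with the TRL version. The point is that per-step time is driven mainly by sequence length, effective batch size, dataset formatting, and the GPU itself, not by max_steps alone.

```python
# Minimal Unsloth QLoRA sketch - placeholder model, dataset, and hyperparameters.
import torch
from unsloth import FastLanguageModel
from trl import SFTTrainer
from transformers import TrainingArguments
from datasets import load_dataset

max_seq_length = 2048

# Load a 4-bit base model through Unsloth's patched loader.
model, tokenizer = FastLanguageModel.from_pretrained(
    model_name="unsloth/llama-3-8b-bnb-4bit",  # placeholder model
    max_seq_length=max_seq_length,
    load_in_4bit=True,
)

# Attach LoRA adapters; only these weights are trained.
model = FastLanguageModel.get_peft_model(
    model,
    r=16,
    lora_alpha=16,
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj",
                    "gate_proj", "up_proj", "down_proj"],
)

raw = load_dataset("yahma/alpaca-cleaned", split="train")  # placeholder dataset

def to_text(example):
    # Collapse instruction/input/output into the single "text" field SFTTrainer reads.
    return {"text": f"### Instruction:\n{example['instruction']}\n\n"
                    f"### Input:\n{example['input']}\n\n"
                    f"### Response:\n{example['output']}" + tokenizer.eos_token}

dataset = raw.map(to_text)

trainer = SFTTrainer(
    model=model,
    tokenizer=tokenizer,
    train_dataset=dataset,
    dataset_text_field="text",
    max_seq_length=max_seq_length,
    args=TrainingArguments(
        per_device_train_batch_size=2,   # effective batch = 2 x 4 with accumulation
        gradient_accumulation_steps=4,
        max_steps=200,                   # the setting discussed above
        learning_rate=2e-4,
        fp16=not torch.cuda.is_bf16_supported(),
        bf16=torch.cuda.is_bf16_supported(),
        logging_steps=10,
        output_dir="outputs",
    ),
)
trainer.train()
```

Timing the same dataset with the plain Transformers/TRL recipe (without the Unsloth loader and patches) on the same GPU is the fairest way to see the speedup.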

Unsloth AI org

Thanks @ewre324! Sadly @dad1909 this is as expected - yes, it depends on the dataset, the config you used, etc. Your GPU might also not have tensor cores. Use free GPUs via Colab or Kaggle - we have many examples on our GitHub page - then export the model to GGUF / vLLM for your own local inference use cases.
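Continuing from the training sketch above, exporting for local inference might look like the following. The helper names and the quantization_method argument are taken from the public Unsloth notebooks; treat them as an assumption and check the current docs for the exact API.

```python
# Continuing from the training sketch above (model and tokenizer still in memory).

# Export a quantized GGUF file for llama.cpp / Ollama style local inference.
model.save_pretrained_gguf("model_gguf", tokenizer, quantization_method="q4_k_m")

# Or merge the LoRA adapters into 16-bit weights for serving with vLLM.
model.save_pretrained_merged("model_merged", tokenizer, save_method="merged_16bit")
```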
