alvdansen 
posted an update Jun 18
Hey All!

I've been asked a lot to share more on how I train LoRAs. The truth is, I don't think my advice is very helpful without also including more contextual, theoretical commentary on how I **think** about training LoRAs for SDXL and other models.

I wrote a first article about it here - let me know what you think.

https://huggingface.co/blog/alvdansen/thoughts-on-lora-training-1

Edit: Also people kept asking where to start so I made a list of possible resources:
https://huggingface.co/blog/alvdansen/thoughts-on-lora-training-pt-2-training-services

Hello! I found your LoRA in Face to All, one of the best. But I can't adapt your other LoRAs for Face to All; they don't work properly. I'm new and maybe I don't understand what to do.


I will need to take a look at what the exact backend of Face to All is. What is the result you're getting?

Just to be sure: since you don't mention any hyperparameters in your post, does that mean you are always using the default ones?


No - I change them, but it's very case by case. I am trying to emphasize elements other than hyperparameters, because in my experience these concepts apply across a wide range of hyperparameter settings.
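To make the "case by case" point concrete, here is a minimal illustrative sketch of the kind of hyperparameters people typically tune per dataset when training a LoRA. All names and values below are assumptions for illustration, not the author's actual settings:

```python
# Illustrative only: these names and values are assumptions,
# not the author's actual configuration. They show the knobs
# most commonly tuned per dataset in SDXL-style LoRA training.
lora_hyperparams = {
    "rank": 16,             # dimension of the low-rank update matrices
    "alpha": 16,            # scaling factor; effective scale = alpha / rank
    "learning_rate": 1e-4,  # usually tuned per dataset and optimizer
    "train_steps": 1500,    # depends heavily on dataset size and style
    "batch_size": 1,        # small style datasets often use batch size 1
}

# The effective scale applied to each LoRA-adapted layer:
effective_scale = lora_hyperparams["alpha"] / lora_hyperparams["rank"]
print(effective_scale)  # 1.0 when alpha equals rank
```

Keeping `alpha` equal to `rank` (effective scale 1.0) is a common starting point; the more interesting decisions, as the post argues, tend to be about dataset curation and captioning rather than these numbers alone.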

What is your favorite and most used LoRA Trainer tool?