Trained for 120 epochs and 3000 steps.
Trained with the datasets ['text-embeds', 'ohwx'].
Learning rate 0.0001, batch size 2, and 2 gradient accumulation steps (effective batch size 4).
Trained with the DDPM noise scheduler, epsilon prediction type, and rescaled_betas_zero_snr=False.
Used 'trailing' timestep spacing.
Base model: black-forest-labs/FLUX.1-dev
VAE: None
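
For reference, the scheduler settings above correspond roughly to a configuration like the one below. This is a minimal sketch assuming the diffusers DDPMScheduler; every argument not listed on this card is left at the library default.

```python
from diffusers import DDPMScheduler

# Noise scheduler mirroring the training settings listed on this card
# (all other arguments are diffusers defaults).
noise_scheduler = DDPMScheduler(
    prediction_type="epsilon",       # epsilon prediction type
    rescale_betas_zero_snr=False,    # rescaled_betas_zero_snr=False
    timestep_spacing="trailing",     # 'trailing' timestep spacing
)
```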
pytorch_lora_weights.safetensors
ADDED (Git LFS pointer)
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:a7ec2c0a69a82e04cc15c0dc2233072cc204491276d1272a6bbb72d0de100280
+size 523493168
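
A minimal inference sketch, assuming the diffusers FluxPipeline, that pytorch_lora_weights.safetensors sits at the root of this repository, and that 'ohwx' acts as the trigger token from the dataset named above; the repository id below is a placeholder.

```python
import torch
from diffusers import FluxPipeline

# Load the base model named on this card.
pipe = FluxPipeline.from_pretrained(
    "black-forest-labs/FLUX.1-dev", torch_dtype=torch.bfloat16
)

# Attach the LoRA weights from this repository (placeholder repo id).
pipe.load_lora_weights(
    "your-username/your-lora-repo",
    weight_name="pytorch_lora_weights.safetensors",
)
pipe.to("cuda")

# 'ohwx' is assumed to be the trigger token from the training dataset.
image = pipe(
    "a photo of ohwx",
    num_inference_steps=28,
    guidance_scale=3.5,
).images[0]
image.save("ohwx.png")
```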