Trained for 99 epochs and 2100 steps.
Trained with datasets ['an example backend for text embeds.', 'carbman1'].
Learning rate 0.0005, batch size 1, and 1 gradient accumulation step.
Used the DDPM noise scheduler for training, with epsilon prediction type and rescaled_betas_zero_snr=False.
Used 'trailing' timestep spacing; these scheduler settings are sketched in the snippet below.
Base model: sayakpaul/FLUX.1-merged
VAE: None
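
For reference, the scheduler settings above roughly map onto a diffusers DDPMScheduler configuration. The snippet below is a sketch only, assuming a recent diffusers release; note that the library spells the flag rescale_betas_zero_snr, and any value not listed in the summary falls back to the library default.

from diffusers import DDPMScheduler

# Reconstruction of the reported training scheduler settings (a sketch,
# not the training script's exact config); unlisted arguments keep defaults.
noise_scheduler = DDPMScheduler(
    prediction_type="epsilon",      # epsilon prediction type
    timestep_spacing="trailing",    # 'trailing' timestep spacing
    rescale_betas_zero_snr=False,   # reported as rescaled_betas_zero_snr=False
)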
pytorch_lora_weights.safetensors CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:
+oid sha256:81fe0bfa35311c145a2b61fa0da2e92d584d191aa194e30ac055ec51ff49e5a8
 size 104667792
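
To run the adapter updated in the diff above on the listed base model, a minimal loading sketch with diffusers follows; the repo id placeholder, dtype, prompt, and sampling parameters are assumptions, not values from this commit.

import torch
from diffusers import FluxPipeline

# Load the base model named above; bfloat16 is an assumed dtype choice.
pipe = FluxPipeline.from_pretrained(
    "sayakpaul/FLUX.1-merged",
    torch_dtype=torch.bfloat16,
)

# "your-username/your-lora-repo" is a placeholder for wherever this
# pytorch_lora_weights.safetensors file is hosted (a local directory also works).
pipe.load_lora_weights(
    "your-username/your-lora-repo",
    weight_name="pytorch_lora_weights.safetensors",
)
pipe.to("cuda")

image = pipe(
    "an example prompt",      # placeholder prompt
    num_inference_steps=28,   # assumed sampling settings
    guidance_scale=3.5,
).images[0]
image.save("example.png")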