ZeroCool94 committed
Commit f8d2192
1 Parent(s): 4a62b04

Update README.md

Files changed (1)
  1. README.md +5 -4
README.md CHANGED
```diff
@@ -87,9 +87,10 @@ image.save("fantasy_forest_illustration.png")
 - #### Stable:
 - [Sygil Diffusion v0.1](https://huggingface.co/Sygil/Sygil-Diffusion/blob/main/sygil-diffusion-v0.1.ckpt): Trained on Stable Diffusion 1.5 for 800,000 steps.
 - [Sygil Diffusion v0.2](https://huggingface.co/Sygil/Sygil-Diffusion/blob/main/sygil-diffusion-v0.2.ckpt): Resumed from Sygil Diffusion v0.1 and trained for a total of 1.77 million steps.
-- [Sygil Diffusion v0.3](https://huggingface.co/Sygil/Sygil-Diffusion/blob/main/sygil-diffusion-v0.3.ckpt): Resumed from Sygil Diffusion v0.2 and trained for a total of 2.01 million steps so far.
+- [Sygil Diffusion v0.3](https://huggingface.co/Sygil/Sygil-Diffusion/blob/main/sygil-diffusion-v0.3.ckpt): Resumed from Sygil Diffusion v0.2 and trained for a total of 2.01 million steps.
+- [Sygil Diffusion v0.4](https://huggingface.co/Sygil/Sygil-Diffusion/blob/main/sygil-diffusion-v0.4.ckpt): Resumed from Sygil Diffusion v0.3 and trained for a total of 2.37 million steps.
 - #### Beta:
-- [sygil-diffusion-v0.4_2318263_lora.ckptt](https://huggingface.co/Sygil/Sygil-Diffusion/blob/main/sygil-diffusion-v0.4_2318263_lora.ckpt): Resumed from Sygil Diffusion v0.3 and trained for a total of 2.31 million steps so far.
+- [sygil-diffusion-v0.4_2370200_lora.ckpt](https://huggingface.co/Sygil/Sygil-Diffusion/blob/main/sygil-diffusion-v0.4.ckpt): Resumed from Sygil Diffusion v0.3 and trained for a total of 2.37 million steps so far.
 
 Note: Checkpoints under the Beta section are updated daily or at least 3-4 times a week. This is usually the equivalent of 1-2 training sessions;
 this is done until they are stable enough to be moved into a proper release, usually every 1 or 2 weeks.
@@ -105,7 +106,7 @@ The model was trained on the following dataset:
 
 **Hardware and others**
 - **Hardware:** 1 x Nvidia RTX 3050 8GB GPU
-- **Hours Trained:** 840 hours approximately.
+- **Hours Trained:** 857 hours approximately.
 - **Optimizer:** AdamW
 - **Adam Beta 1**: 0.9
 - **Adam Beta 2**: 0.999
@@ -120,7 +121,7 @@ The model was trained on the following dataset:
 - **Lora unet Learning Rate**: 1e-7
 - **Lora Text Encoder Learning Rate**: 1e-7
 - **Resolution**: 512 pixels
-- **Total Training Steps:** 2,318,263
+- **Total Training Steps:** 2,370,200
 
 
 Note: For the learning rate I'm testing something new, after changing from using the `constant` scheduler to `cosine_with_restarts` after v0.3 was released, I noticed
```
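The diff above mentions switching from a `constant` learning-rate scheduler to `cosine_with_restarts`. As a rough illustration of what that schedule does (this is not the author's training code; the cycle count, step count, and warmup length below are made-up values), a cosine-with-hard-restarts schedule can be sketched in plain Python:

```python
import math

def cosine_with_restarts_lr(step, total_steps, base_lr, num_cycles=3, warmup_steps=0):
    """Cosine learning-rate schedule with hard restarts (illustrative sketch).

    After an optional linear warmup, the learning rate follows num_cycles
    cosine decays from base_lr down toward 0, each ending in a hard restart
    back up to base_lr -- the shape of the `cosine_with_restarts` schedule
    offered by common training libraries.
    """
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)
    progress = (step - warmup_steps) / max(1, total_steps - warmup_steps)
    if progress >= 1.0:
        return 0.0
    # Position within the current cosine cycle, in [0, 1).
    cycle_pos = (progress * num_cycles) % 1.0
    return base_lr * 0.5 * (1.0 + math.cos(math.pi * cycle_pos))

# Example: base LR of 1e-7 (as in the Lora settings above), 3 cycles over
# 300 steps, no warmup -- the 300 steps are purely illustrative.
lrs = [cosine_with_restarts_lr(s, total_steps=300, base_lr=1e-7) for s in range(300)]
```

Within each cycle the rate decays smoothly toward zero, then jumps back to the base rate at the cycle boundary; that periodic "kick" is what distinguishes it from a plain `cosine` schedule.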