dataautogpt3/dpo-sdxl-merged
License: mit
dataautogpt3 committed on Dec 19, 2023
Commit 44c8538 • 1 Parent(s): eee04bd
Update README.md
Files changed (1): README.md (+3 −0)
@@ -1,3 +1,6 @@
 ---
 license: mit
 ---
+Juggernaut 7XL and DPO Merge to safetensors format.
+
+unet is DPO and vae and clip are from Juggernaut 7XL.