Nazzaroth2 committed
Commit 6e13727
1 Parent(s): be7c95b

Update README.md

Initial description

Files changed (1)
  1. README.md +82 -0
README.md CHANGED
---
license: cc-by-nc-4.0
language:
- en
tags:
- art
- texture
- game-development
- asset-creation
- pbr
- stable_diffusion
---
# Introducing Texture Hell

This is my first publicly released SD model. It is built specifically to create albedo/diffuse textures for use in video games and animations. It has been fully fine-tuned on the SD 2.1 768 base model to give you high-resolution results. Let's dive into the details.

Please read this description before using the model, as it works rather differently from the usual SD models.
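
If you want to use the model from a script instead of the WebUI, here is a minimal loading sketch with the 🤗 diffusers library. It assumes the weights are published in diffusers format under the repo id from this page's URL; if the repo only ships a single .ckpt/.safetensors file, load that file with `StableDiffusionPipeline.from_single_file` instead.

```python
# Minimal loading sketch (assumption: diffusers-format weights are available
# under this repo id; otherwise use StableDiffusionPipeline.from_single_file).
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "Nazzaroth2/texture-hell",   # repo id taken from the model page URL
    torch_dtype=torch.float16,
).to("cuda")
```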

## Training and prompt tips

The dataset consists of 350 high-quality images sourced from Poly Haven, ensuring a diverse and representative range of albedo textures.
Clip 1 was used throughout training.
Prompts were comma-separated tag lists (no CLIP/BERT full-text descriptions). The most important tag is "texture", followed by any material you are interested in. Because of the limited dataset, some domains are lacking (e.g. flesh and cloth); I plan to improve diversity in subsequent versions. A full list of tags used on the images, with frequency statistics, is available -here- to help you identify potentially interesting tags. There is probably a lot of room to experiment with other words, though; I was only able to do rudimentary testing.

Very important: there are some aerial textures in the dataset (e.g. a beach shot by drone from very far up) for landscape textures. If you don't want this type of texture, put "aerial" into the negative prompt.
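
As a rough illustration of the tag-style prompting and the "aerial" negative prompt, continuing from the loading sketch above (the material tags and sampler settings below are only examples, not tags verified against the training data):

```python
# Comma-separated tag prompt: "texture" first, then material tags of interest.
# The specific tags and settings below are illustrative, not prescriptive.
image = pipe(
    prompt="texture, mossy stone, weathered",
    negative_prompt="aerial",      # keep the drone-shot landscape textures out
    height=768,                    # the model was fine-tuned on the 768 base
    width=768,
    num_inference_steps=30,
    guidance_scale=7.0,
).images[0]
image.save("mossy_stone_albedo.png")
```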

All example images have the full prompt information embedded inside them, so you can look up samplers, steps, etc.
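
If you want to pull those settings out of an example image programmatically, here is a small sketch, assuming the file is a PNG saved by the AUTOMATIC1111 WebUI (which stores the settings under a "parameters" text key):

```python
# Print the generation settings embedded in a WebUI-saved PNG.
from PIL import Image

with Image.open("example_texture.png") as img:   # hypothetical file name
    params = img.info.get("parameters")
print(params or "no embedded parameters found")
```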

## Tiling and WebUI

Please note that the generated textures do not tile perfectly. While there is a tiling option available in the WebUI, I do not recommend using it. Instead, I have found a "reasonable" manual workflow to turn them into seamless textures. A tutorial can be found below and on my Hugging Face page:
https://huggingface.co/Nazzaroth2/texture-hell

I hope to improve the inherent tiling ability of the model in future versions; a bigger dataset and good tagging in particular should help here.

## Other Textures

My first attempt involved generating the full set of textures needed for the usual PBR-based render pipeline (normal, height, roughness, etc.). The initial results were promising, but the reduction in resolution and the limited dataset made me change my plans. This might change in the future.

In the meantime you can use other software to create these textures from the albedo image. I personally used the free option Materialize (https://boundingboxsoftware.com/materialize), but there are of course also paid options like Substance Designer.

I am curious to see what you all are able to make with this model and hope it can help you in your projects. If you have any ideas for improvements, or a specific material type you think is lacking in the model, I am happy to hear from you.

Happy texturing!


P.S. Why Texture Hell? Because hell is way more interesting than boring old heaven :3


# How to Create Seamless Textures

## Additional Software Used
- AUTOMATIC1111 WebUI
- image editing software (Photoshop or GIMP)
- lama-cleaner (https://github.com/Sanster/lama-cleaner)

## Step 1: Create the Texture
Create your desired texture with the usual text2image workflow. Do NOT use the tiling option yet.

## Step 2: Inpaint the Texture
1. Send the image to the inpainting tab.
2. Hand-paint a mask around the edges or use one of my pre-made mask images (or generate one; see the sketch after this list).
3. Now activate the tiling option.
4. Adjust the denoising strength. A value of 0.75 is a good starting point, but feel free to experiment as needed.
5. Save the inpainted image.
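
If you would rather generate the edge mask than paint it by hand, here is a small Pillow sketch for a white border band (white marks the area to inpaint; the band width is just a starting value to tweak):

```python
# Build an inpainting mask that covers only a band around the image edges.
from PIL import Image, ImageDraw

SIZE = 768    # match your generated texture resolution
BAND = 96     # width of the edge band in pixels (illustrative value)

mask = Image.new("L", (SIZE, SIZE), 255)   # start fully white (= inpaint)
draw = ImageDraw.Draw(mask)
# Black out the centre so only the edge band remains white.
draw.rectangle([BAND, BAND, SIZE - BAND - 1, SIZE - BAND - 1], fill=0)
mask.save("edge_mask.png")
```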

## Step 3: Offset the Edges in an Image Editor
1. Open the inpainted image in the image editor of your choice.
2. Use the offset modifier to push the edges into the center of the image (check online tutorials for specific instructions on the offset modifier in your chosen image editor, or use the script sketch after this list).
3. Save the modified image.
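
The wrap-around offset can also be scripted; here is a Pillow sketch (ImageChops.offset wraps pixels around the image border, the same thing the offset tool in GIMP or Photoshop does):

```python
# Wrap the image by half its width and height so the former edges meet in the
# middle of the frame, where the seams are easy to inspect and clean up.
from PIL import Image, ImageChops

img = Image.open("inpainted_texture.png")            # hypothetical file name
shifted = ImageChops.offset(img, img.width // 2, img.height // 2)
shifted.save("offset_texture.png")

# Running the exact same offset again (Step 6) returns everything to its
# original position, as long as the width and height are even numbers.
```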

## Step 4: Clean Up the Texture with lama-cleaner
1. Open the offset image in lama-cleaner.
2. Paint over parts that do not line up perfectly, or remove any larger objects that cannot be fixed. LaMa or ZITS are my preferred models, but you can also experiment here.
3. Save the cleaned image.

## Step 5: Additional Cleanup (Optional)
Depending on your quality requirements, you may want to open the cleaned image in your image editor again and further clean up any imperfections. Continue this process until you are satisfied with the results.

## Step 6: Restore the Original Texture
1. Open the cleaned image in your image editor.
2. Use the offset modifier again, with the same half-width/half-height offset as in Step 3, to return the image to its original position. The texture should now have seamless edges.
3. Save the final seamless texture and use other software such as Materialize (https://boundingboxsoftware.com/materialize/) to create the missing maps, such as the normal map and roughness.