This is a merged model of the following models. Huge thanks to these amazing model creators. The table below lists, for each UNet parameter, the merge weight contributed by each source model; a minimal sketch of how such a per-parameter weighted merge can be computed follows the list.
+ [RMHF - 2.5D-V2](https://civitai.com/models/101518)
+ [RMHF - AnimeV1](https://civitai.com/models/101518?modelVersionId=109075)
+ [MeinaPastel - V6](https://civitai.com/models/11866/meinapastel)
+ [MeinaMix - V10](https://civitai.com/models/7240?modelVersionId=80511)
+ [CuteYukiMix - EchoDimension](https://civitai.com/models/28169/cuteyukimixadorable-style)
+ [ToonYou - Beta5Unstable](https://civitai.com/models/30240?modelVersionId=102996)
+ [RealCartoon-Anime - V3](https://civitai.com/models/96629/realcartoon-anime)
+ [Fantexi - V0.9 Beta](https://civitai.com/models/18427?modelVersionId=95199)
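
Below is a minimal sketch of how a per-parameter weighted merge like the one tabulated here could be computed. It is an illustration only, not the script actually used for this model: the `merge_state_dicts` helper and the toy weights are hypothetical, and real checkpoint keys may differ from the `unet.*` names shown in the table.

```python
import torch


def merge_state_dicts(
    state_dicts: dict[str, dict[str, torch.Tensor]],
    weights: dict[str, dict[str, float]],
) -> dict[str, torch.Tensor]:
    """For every parameter key, take a weighted sum of the source models' tensors.

    `weights[key][model_name]` corresponds to one cell of the table below
    (expressed as a fraction rather than a percentage); the weights for a
    single key are expected to sum to roughly 1.0.
    """
    merged = {}
    for key, per_model in weights.items():
        merged[key] = sum(
            w * state_dicts[name][key].float() for name, w in per_model.items()
        )
    return merged


# Toy usage with dummy tensors and arbitrary weights (real use would load the
# checkpoints of the source models listed above, e.g. via safetensors).
dummy = {
    name: {"unet.conv_in.weight": torch.randn(320, 4, 3, 3)}
    for name in ("MeinaMix - V10", "ToonYou - Beta5Unstable")
}
row_weights = {
    "unet.conv_in.weight": {"MeinaMix - V10": 0.7, "ToonYou - Beta5Unstable": 0.3}
}
merged = merge_state_dicts(dummy, row_weights)
print(merged["unet.conv_in.weight"].shape)  # torch.Size([320, 4, 3, 3])
```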
| UNet parameter | RMHF - 2.5D-V2 | RMHF - AnimeV1 | MeinaPastel - V6 | MeinaMix - V10 | CuteYukiMix - EchoDimension | ToonYou - Beta5Unstable | RealCartoon-Anime - V3 | Fantexi - V0.9 Beta |
| - | - | - | - | - | - | - | - | - |
| unet.conv_in.weight | 0.00% | 0.00% | 0.00% | 99.93% | 0.00% | 0.03% | 0.00% | 0.03% |
| unet.conv_in.bias | 0.01% | 0.02% | 0.00% | 0.01% | 0.00% | 99.94% | 0.02% | 0.00% |
| unet.time_embedding.linear_1.weight | 0.14% | 0.00% | 0.00% | 2.29% | 0.07% | 63.99% | 33.49% | 0.02% |
| unet.time_embedding.linear_1.bias | 0.00% | 0.00% | 0.00% | 1.59% | 0.34% | 78.27% | 19.79% | 0.00% |
| unet.time_embedding.linear_2.weight | 2.29% | 2.25% | 0.00% | 36.84% | 0.41% | 48.26% | 7.75% | 2.18% |
| unet.time_embedding.linear_2.bias | 0.00% | 0.86% | 0.00% | 12.90% | 0.00% | 0.03% | 86.21% | 0.00% |
| unet.down_blocks.0.attentions.0.norm.weight | 0.02% | 2.51% | 0.00% | 1.50% | 0.02% | 95.95% | 0.01% | 0.00% |
| unet.down_blocks.0.attentions.0.norm.bias | 0.00% | 51.42% | 1.98% | 46.59% | 0.01% | 0.00% | 0.00% | 0.00% |
| unet.down_blocks.0.attentions.0.proj_in.weight | 0.00% | 0.00% | 5.18% | 22.10% | 0.07% | 52.54% | 20.08% | 0.02% |
| unet.down_blocks.0.attentions.0.proj_in.bias | 0.04% | 0.01% | 0.00% | 0.04% | 4.04% | 69.30% | 26.57% | 0.00% |
| unet.down_blocks.0.attentions.0.transformer_blocks.0.attn1.to_q.weight | 0.00% | 0.00% | 0.00% | 75.31% | 0.00% | 24.08% | 0.45% | 0.15% |
| unet.down_blocks.0.attentions.0.transformer_blocks.0.attn1.to_k.weight | 0.00% | 0.00% | 0.00% | 0.00% | 0.00% | 58.44% | 0.00% | 41.55% |
| unet.down_blocks.0.attentions.0.transformer_blocks.0.attn1.to_v.weight | 0.00% | 0.00% | 0.00% | 94.30% | 0.00% | 5.47% | 0.17% | 0.07% |
| unet.down_blocks.0.attentions.0.transformer_blocks.0.attn1.to_out.0.weight | 0.01% | 0.00% | 0.00% | 96.73% | 0.00% | 3.25% | 0.00% | 0.00% |
| unet.down_blocks.0.attentions.0.transformer_blocks.0.attn1.to_out.0.bias | 0.80% | 0.00% | 0.00% | 0.00% | 97.63% | 0.01% | 0.00% | 1.56% |
| unet.down_blocks.0.attentions.0.transformer_blocks.0.ff.net.0.proj.weight | 0.00% | 8.05% | 0.00% | 0.00% | 91.80% | 0.10% | 0.01% | 0.03% |
| unet.down_blocks.0.attentions.0.transformer_blocks.0.ff.net.0.proj.bias | 0.00% | 99.99% | 0.00% | 0.00% | 0.00% | 0.01% | 0.00% | 0.00% |
| unet.down_blocks.0.attentions.0.transformer_blocks.0.ff.net.2.weight | 0.00% | 0.00% | 78.97% | 19.86% | 0.00% | 0.00% | 1.17% | 0.00% |
| unet.down_blocks.0.attentions.0.transformer_blocks.0.ff.net.2.bias | 0.00% | 0.05% | 0.00% | 0.00% | 0.00% | 97.26% | 0.00% | 2.68% |
| unet.down_blocks.0.attentions.0.transformer_blocks.0.attn2.to_q.weight | 0.00% | 1.70% | 0.51% | 67.79% | 27.01% | 2.97% | 0.00% | 0.02% |
| unet.down_blocks.0.attentions.0.transformer_blocks.0.attn2.to_k.weight | 0.00% | 0.01% | 0.00% | 79.13% | 0.00% | 8.29% | 0.00% | 12.56% |
| unet.down_blocks.0.attentions.0.transformer_blocks.0.attn2.to_v.weight | 0.00% | 0.00% | 0.00% | 0.00% | 0.00% | 13.48% | 86.38% | 0.14% |
| unet.down_blocks.0.attentions.0.transformer_blocks.0.attn2.to_out.0.weight | 0.00% | 0.01% | 0.00% | 81.95% | 0.02% | 0.06% | 0.47% | 17.48% |
| unet.down_blocks.0.attentions.0.transformer_blocks.0.attn2.to_out.0.bias | 0.05% | 0.00% | 0.00% | 0.19% | 98.68% | 1.08% | 0.00% | 0.00% |
| unet.down_blocks.0.attentions.0.transformer_blocks.0.norm1.weight | 0.00% | 0.00% | 0.00% | 10.64% | 0.00% | 89.30% | 0.00% | 0.06% |
| unet.down_blocks.0.attentions.0.transformer_blocks.0.norm1.bias | 0.00% | 0.00% | 0.00% | 0.05% | 1.16% | 98.61% | 0.18% | 0.00% |
| unet.down_blocks.0.attentions.0.transformer_blocks.0.norm2.weight | 0.00% | 0.65% | 0.00% | 99.33% | 0.00% | 0.00% | 0.00% | 0.01% |
| unet.down_blocks.0.attentions.0.transformer_blocks.0.norm2.bias | 0.00% | 0.01% | 0.01% | 1.83% | 0.05% | 0.00% | 91.10% | 6.99% |
| unet.down_blocks.0.attentions.0.transformer_blocks.0.norm3.weight | 0.00% | 0.00% | 0.01% | 0.01% | 0.00% | 0.00% | 2.35% | 97.62% |
| unet.down_blocks.0.attentions.0.transformer_blocks.0.norm3.bias | 0.00% | 0.00% | 0.00% | 36.74% | 0.00% | 6.52% | 56.74% | 0.00% |
| unet.down_blocks.0.attentions.0.proj_out.weight | 0.00% | 0.00% | 0.00% | 0.17% | 0.00% | 0.00% | 0.00% | 99.82% |
| unet.down_blocks.0.attentions.0.proj_out.bias | 0.00% | 0.00% | 9.92% | 88.84% | 0.00% | 0.00% | 0.03% | 1.22% |
| unet.down_blocks.0.attentions.1.norm.weight | 0.00% | 4.72% | 0.00% | 94.38% | 0.11% | 0.05% | 0.67% | 0.07% |
| unet.down_blocks.0.attentions.1.norm.bias | 0.00% | 0.00% | 0.02% | 98.14% | 0.00% | 1.84% | 0.00% | 0.00% |
| unet.down_blocks.0.attentions.1.proj_in.weight | 0.28% | 3.77% | 0.00% | 0.03% | 0.00% | 0.00% | 95.85% | 0.06% |
| unet.down_blocks.0.attentions.1.proj_in.bias | 0.00% | 0.00% | 57.93% | 0.00% | 0.00% | 38.04% | 3.61% | 0.41% |
| unet.down_blocks.0.attentions.1.transformer_blocks.0.attn1.to_q.weight | 0.00% | 0.00% | 0.00% | 0.00% | 0.17% | 0.23% | 0.00% | 99.60% |
| unet.down_blocks.0.attentions.1.transformer_blocks.0.attn1.to_k.weight | 1.08% | 0.00% | 0.00% | 0.02% | 0.00% | 98.89% | 0.00% | 0.01% |
| unet.down_blocks.0.attentions.1.transformer_blocks.0.attn1.to_v.weight | 1.33% | 4.05% | 0.00% | 0.01% | 0.30% | 94.18% | 0.00% | 0.14% |
| unet.down_blocks.0.attentions.1.transformer_blocks.0.attn1.to_out.0.weight | 0.00% | 0.01% | 0.00% | 0.00% | 0.06% | 79.83% | 20.10% | 0.00% |
| unet.down_blocks.0.attentions.1.transformer_blocks.0.attn1.to_out.0.bias | 0.00% | 0.00% | 0.11% | 38.90% | 29.82% | 0.16% | 0.00% | 31.00% |
| unet.down_blocks.0.attentions.1.transformer_blocks.0.ff.net.0.proj.weight | 0.05% | 0.01% | 0.00% | 0.04% | 0.00% | 9.32% | 85.34% | 5.25% |
| unet.down_blocks.0.attentions.1.transformer_blocks.0.ff.net.0.proj.bias | 0.02% | 0.00% | 0.00% | 0.00% | 0.05% | 99.93% | 0.00% | 0.00% |
| unet.down_blocks.0.attentions.1.transformer_blocks.0.ff.net.2.weight | 0.13% | 0.00% | 0.00% | 0.06% | 0.02% | 9.87% | 89.92% | 0.00% |
| unet.down_blocks.0.attentions.1.transformer_blocks.0.ff.net.2.bias | 4.47% | 0.20% | 0.00% | 11.53% | 0.00% | 1.03% | 0.00% | 82.78% |
| unet.down_blocks.0.attentions.1.transformer_blocks.0.attn2.to_q.weight | 0.00% | 55.98% | 0.00% | 0.00% | 33.25% | 0.00% | 10.72% | 0.04% |
| unet.down_blocks.0.attentions.1.transformer_blocks.0.attn2.to_k.weight | 2.84% | 0.03% | 0.00% | 9.01% | 19.19% | 0.00% | 0.44% | 68.49% |
| unet.down_blocks.0.attentions.1.transformer_blocks.0.attn2.to_v.weight | 0.00% | 0.17% | 2.71% | 0.00% | 0.00% | 4.27% | 92.84% | 0.00% |
| unet.down_blocks.0.attentions.1.transformer_blocks.0.attn2.to_out.0.weight | 0.00% | 79.52% | 0.00% | 0.00% | 0.00% | 12.90% | 0.20% | 7.37% |
| unet.down_blocks.0.attentions.1.transformer_blocks.0.attn2.to_out.0.bias | 0.00% | 0.49% | 0.00% | 0.00% | 99.48% | 0.00% | 0.02% | 0.00% |
| unet.down_blocks.0.attentions.1.transformer_blocks.0.norm1.weight | 0.99% | 0.00% | 0.00% | 0.06% | 0.00% | 0.30% | 98.65% | 0.00% |
| unet.down_blocks.0.attentions.1.transformer_blocks.0.norm1.bias | 14.99% | 0.00% | 0.01% | 84.64% | 0.00% | 0.21% | 0.02% | 0.14% |
| unet.down_blocks.0.attentions.1.transformer_blocks.0.norm2.weight | 0.00% | 4.05% | 0.00% | 0.00% | 23.41% | 16.20% | 0.38% | 55.95% |
| unet.down_blocks.0.attentions.1.transformer_blocks.0.norm2.bias | 0.01% | 89.91% | 0.02% | 5.71% | 0.00% | 0.02% | 4.34% | 0.00% |
| unet.down_blocks.0.attentions.1.transformer_blocks.0.norm3.weight | 0.00% | 96.66% | 0.05% | 0.01% | 0.15% | 0.00% | 0.04% | 3.09% |
| unet.down_blocks.0.attentions.1.transformer_blocks.0.norm3.bias | 0.00% | 0.19% | 0.00% | 0.00% | 0.00% | 32.91% | 66.90% | 0.00% |
| unet.down_blocks.0.attentions.1.proj_out.weight | 52.53% | 21.57% | 0.00% | 24.55% | 0.03% | 0.04% | 1.29% | 0.00% |
| unet.down_blocks.0.attentions.1.proj_out.bias | 0.00% | 4.54% | 0.00% | 2.04% | 5.58% | 0.00% | 84.01% | 3.83% |
| unet.down_blocks.0.resnets.0.norm1.weight | 83.60% | 2.37% | 0.00% | 0.00% | 0.00% | 14.01% | 0.01% | 0.01% |
| unet.down_blocks.0.resnets.0.norm1.bias | 0.01% | 3.94% | 2.30% | 0.00% | 0.04% | 24.81% | 68.89% | 0.00% |
| unet.down_blocks.0.resnets.0.conv1.weight | 0.00% | 0.01% | 2.45% | 86.44% | 0.00% | 0.20% | 10.89% | 0.01% |
| unet.down_blocks.0.resnets.0.conv1.bias | 0.00% | 0.14% | 0.48% | 25.32% | 0.36% | 1.73% | 71.97% | 0.00% |
| unet.down_blocks.0.resnets.0.time_emb_proj.weight | 0.00% | 0.00% | 1.12% | 13.10% | 2.51% | 0.01% | 0.00% | 83.26% |
| unet.down_blocks.0.resnets.0.time_emb_proj.bias | 0.05% | 0.00% | 0.00% | 39.35% | 41.45% | 3.87% | 15.28% | 0.00% |
| unet.down_blocks.0.resnets.0.norm2.weight | 0.86% | 0.02% | 0.00% | 0.02% | 29.32% | 69.73% | 0.05% | 0.00% |
| unet.down_blocks.0.resnets.0.norm2.bias | 0.04% | 0.00% | 0.01% | 0.26% | 17.73% | 0.82% | 0.05% | 81.09% |
| unet.down_blocks.0.resnets.0.conv2.weight | 0.00% | 38.47% | 0.00% | 60.86% | 0.00% | 0.03% | 0.65% | 0.00% |
| unet.down_blocks.0.resnets.0.conv2.bias | 0.00% | 0.00% | 0.03% | 0.09% | 0.03% | 0.00% | 99.85% | 0.00% |
| unet.down_blocks.0.resnets.1.norm1.weight | 0.00% | 0.00% | 0.00% | 0.09% | 0.00% | 0.02% | 99.89% | 0.00% |
| unet.down_blocks.0.resnets.1.norm1.bias | 0.00% | 26.30% | 0.04% | 0.00% | 7.35% | 0.14% | 0.00% | 66.16% |
| unet.down_blocks.0.resnets.1.conv1.weight | 0.00% | 67.68% | 0.01% | 0.00% | 31.64% | 0.02% | 0.61% | 0.03% |
| unet.down_blocks.0.resnets.1.conv1.bias | 4.75% | 0.00% | 0.11% | 93.91% | 0.00% | 0.00% | 1.23% | 0.00% |
| unet.down_blocks.0.resnets.1.time_emb_proj.weight | 13.09% | 71.75% | 0.16% | 0.91% | 0.11% | 4.25% | 9.73% | 0.00% |
| unet.down_blocks.0.resnets.1.time_emb_proj.bias | 0.07% | 0.00% | 0.03% | 96.88% | 0.06% | 0.00% | 2.95% | 0.00% |
| unet.down_blocks.0.resnets.1.norm2.weight | 0.01% | 0.01% | 32.60% | 45.78% | 0.35% | 0.00% | 13.82% | 7.43% |
| unet.down_blocks.0.resnets.1.norm2.bias | 0.40% | 0.00% | 3.23% | 0.00% | 0.23% | 95.39% | 0.01% | 0.75% |
| unet.down_blocks.0.resnets.1.conv2.weight | 98.23% | 0.00% | 0.00% | 0.00% | 0.02% | 1.73% | 0.00% | 0.01% |
| unet.down_blocks.0.resnets.1.conv2.bias | 0.02% | 0.28% | 0.00% | 0.00% | 0.38% | 99.33% | 0.00% | 0.00% |
| unet.down_blocks.0.downsamplers.0.conv.weight | 4.26% | 0.11% | 0.00% | 95.10% | 0.00% | 0.01% | 0.52% | 0.00% |
| unet.down_blocks.0.downsamplers.0.conv.bias | 0.00% | 0.01% | 8.33% | 0.00% | 0.00% | 83.49% | 8.16% | 0.01% |
| unet.down_blocks.1.attentions.0.norm.weight | 0.35% | 0.00% | 0.02% | 62.22% | 37.40% | 0.00% | 0.01% | 0.00% |
| unet.down_blocks.1.attentions.0.norm.bias | 0.00% | 0.56% | 93.52% | 0.21% | 3.36% | 2.33% | 0.02% | 0.00% |
| unet.down_blocks.1.attentions.0.proj_in.weight | 0.01% | 0.01% | 0.00% | 0.00% | 0.38% | 0.06% | 0.92% | 98.62% |
| unet.down_blocks.1.attentions.0.proj_in.bias | 53.29% | 0.01% | 0.00% | 0.01% | 0.00% | 0.00% | 46.69% | 0.00% |
| unet.down_blocks.1.attentions.0.transformer_blocks.0.attn1.to_q.weight | 0.04% | 0.01% | 0.00% | 0.24% | 76.50% | 0.00% | 23.20% | 0.01% |
| unet.down_blocks.1.attentions.0.transformer_blocks.0.attn1.to_k.weight | 0.02% | 0.00% | 0.00% | 0.00% | 42.66% | 0.00% | 57.33% | 0.00% |
| unet.down_blocks.1.attentions.0.transformer_blocks.0.attn1.to_v.weight | 0.00% | 0.10% | 0.38% | 0.00% | 0.00% | 99.51% | 0.00% | 0.00% |
| unet.down_blocks.1.attentions.0.transformer_blocks.0.attn1.to_out.0.weight | 19.93% | 0.00% | 0.02% | 21.98% | 0.03% | 1.38% | 56.58% | 0.08% |
| unet.down_blocks.1.attentions.0.transformer_blocks.0.attn1.to_out.0.bias | 4.69% | 0.00% | 0.00% | 0.00% | 61.73% | 31.78% | 1.79% | 0.00% |
| unet.down_blocks.1.attentions.0.transformer_blocks.0.ff.net.0.proj.weight | 0.01% | 5.63% | 0.00% | 0.00% | 93.94% | 0.00% | 0.41% | 0.00% |
| unet.down_blocks.1.attentions.0.transformer_blocks.0.ff.net.0.proj.bias | 25.46% | 0.00% | 0.00% | 0.00% | 0.00% | 35.48% | 38.45% | 0.61% |
| unet.down_blocks.1.attentions.0.transformer_blocks.0.ff.net.2.weight | 0.01% | 0.04% | 0.05% | 70.50% | 3.93% | 25.42% | 0.06% | 0.00% |
| unet.down_blocks.1.attentions.0.transformer_blocks.0.ff.net.2.bias | 0.00% | 0.00% | 0.00% | 2.32% | 0.00% | 5.44% | 92.24% | 0.00% |
| unet.down_blocks.1.attentions.0.transformer_blocks.0.attn2.to_q.weight | 0.00% | 0.00% | 0.07% | 2.56% | 0.00% | 0.00% | 97.37% | 0.00% |
| unet.down_blocks.1.attentions.0.transformer_blocks.0.attn2.to_k.weight | 0.00% | 0.00% | 0.00% | 0.00% | 15.23% | 0.01% | 84.76% | 0.00% |
| unet.down_blocks.1.attentions.0.transformer_blocks.0.attn2.to_v.weight | 0.00% | 0.00% | 0.00% | 44.56% | 0.52% | 0.00% | 24.04% | 30.87% |
| unet.down_blocks.1.attentions.0.transformer_blocks.0.attn2.to_out.0.weight | 0.23% | 0.06% | 0.00% | 1.54% | 98.16% | 0.00% | 0.00% | 0.00% |
| unet.down_blocks.1.attentions.0.transformer_blocks.0.attn2.to_out.0.bias | 81.64% | 0.00% | 0.00% | 0.00% | 0.22% | 0.03% | 18.11% | 0.00% |
| unet.down_blocks.1.attentions.0.transformer_blocks.0.norm1.weight | 0.00% | 0.00% | 3.90% | 89.09% | 5.15% | 1.84% | 0.02% | 0.00% |
| unet.down_blocks.1.attentions.0.transformer_blocks.0.norm1.bias | 99.68% | 0.00% | 0.00% | 0.00% | 0.05% | 0.02% | 0.26% | 0.00% |
| unet.down_blocks.1.attentions.0.transformer_blocks.0.norm2.weight | 0.03% | 0.00% | 39.08% | 0.02% | 60.61% | 0.00% | 0.26% | 0.00% |
| unet.down_blocks.1.attentions.0.transformer_blocks.0.norm2.bias | 0.00% | 0.00% | 0.00% | 76.94% | 1.17% | 0.88% | 21.00% | 0.01% |
| unet.down_blocks.1.attentions.0.transformer_blocks.0.norm3.weight | 0.00% | 1.53% | 8.92% | 0.06% | 0.00% | 89.47% | 0.02% | 0.00% |
| unet.down_blocks.1.attentions.0.transformer_blocks.0.norm3.bias | 69.24% | 0.01% | 0.00% | 0.00% | 0.00% | 4.29% | 0.03% | 26.43% |
| unet.down_blocks.1.attentions.0.proj_out.weight | 0.00% | 24.09% | 0.17% | 0.84% | 71.89% | 2.18% | 0.82% | 0.01% |
| unet.down_blocks.1.attentions.0.proj_out.bias | 0.00% | 0.00% | 0.00% | 0.00% | 0.01% | 0.00% | 0.00% | 99.98% |
| unet.down_blocks.1.attentions.1.norm.weight | 0.00% | 0.00% | 0.00% | 51.24% | 0.00% | 45.72% | 0.00% | 3.04% |
| unet.down_blocks.1.attentions.1.norm.bias | 0.30% | 3.21% | 0.00% | 7.29% | 0.00% | 89.19% | 0.01% | 0.00% |
| unet.down_blocks.1.attentions.1.proj_in.weight | 0.00% | 0.07% | 0.00% | 0.88% | 89.23% | 7.05% | 2.76% | 0.00% |
| unet.down_blocks.1.attentions.1.proj_in.bias | 0.20% | 0.01% | 0.01% | 99.21% | 0.00% | 0.56% | 0.00% | 0.00% |
| unet.down_blocks.1.attentions.1.transformer_blocks.0.attn1.to_q.weight | 0.04% | 0.01% | 0.44% | 0.00% | 0.00% | 0.00% | 0.00% | 99.52% |
| unet.down_blocks.1.attentions.1.transformer_blocks.0.attn1.to_k.weight | 6.03% | 0.00% | 0.00% | 0.05% | 0.02% | 0.00% | 93.84% | 0.06% |
| unet.down_blocks.1.attentions.1.transformer_blocks.0.attn1.to_v.weight | 98.44% | 0.00% | 0.00% | 0.13% | 0.23% | 0.51% | 0.69% | 0.00% |
| unet.down_blocks.1.attentions.1.transformer_blocks.0.attn1.to_out.0.weight | 0.00% | 0.08% | 3.90% | 0.00% | 0.00% | 14.08% | 0.00% | 81.95% |
| unet.down_blocks.1.attentions.1.transformer_blocks.0.attn1.to_out.0.bias | 0.00% | 92.06% | 0.00% | 0.00% | 0.00% | 0.62% | 6.33% | 0.98% |
| unet.down_blocks.1.attentions.1.transformer_blocks.0.ff.net.0.proj.weight | 0.00% | 2.53% | 0.00% | 0.00% | 13.12% | 53.45% | 0.00% | 30.90% |
| unet.down_blocks.1.attentions.1.transformer_blocks.0.ff.net.0.proj.bias | 0.00% | 2.24% | 0.00% | 0.00% | 0.72% | 69.66% | 27.27% | 0.11% |
| unet.down_blocks.1.attentions.1.transformer_blocks.0.ff.net.2.weight | 11.07% | 0.00% | 0.00% | 0.00% | 0.15% | 85.52% | 3.25% | 0.01% |
| unet.down_blocks.1.attentions.1.transformer_blocks.0.ff.net.2.bias | 0.00% | 0.00% | 0.00% | 0.00% | 0.02% | 0.20% | 99.75% | 0.03% |
| unet.down_blocks.1.attentions.1.transformer_blocks.0.attn2.to_q.weight | 0.78% | 14.11% | 0.00% | 43.87% | 0.00% | 14.58% | 26.66% | 0.00% |
| unet.down_blocks.1.attentions.1.transformer_blocks.0.attn2.to_k.weight | 0.00% | 0.00% | 0.00% | 80.95% | 19.03% | 0.02% | 0.00% | 0.00% |
| unet.down_blocks.1.attentions.1.transformer_blocks.0.attn2.to_v.weight | 0.00% | 0.00% | 0.00% | 63.10% | 0.00% | 36.90% | 0.00% | 0.00% |
| unet.down_blocks.1.attentions.1.transformer_blocks.0.attn2.to_out.0.weight | 0.00% | 0.00% | 0.01% | 0.00% | 0.00% | 0.00% | 99.83% | 0.16% |
| unet.down_blocks.1.attentions.1.transformer_blocks.0.attn2.to_out.0.bias | 7.80% | 0.12% | 0.04% | 39.21% | 15.96% | 0.65% | 35.99% | 0.23% |
| unet.down_blocks.1.attentions.1.transformer_blocks.0.norm1.weight | 36.82% | 9.72% | 0.02% | 0.01% | 9.53% | 0.01% | 40.31% | 3.58% |
| unet.down_blocks.1.attentions.1.transformer_blocks.0.norm1.bias | 0.00% | 0.22% | 0.00% | 27.83% | 8.72% | 26.87% | 36.13% | 0.23% |
| unet.down_blocks.1.attentions.1.transformer_blocks.0.norm2.weight | 80.70% | 0.89% | 0.00% | 0.00% | 2.69% | 15.32% | 0.41% | 0.00% |
| unet.down_blocks.1.attentions.1.transformer_blocks.0.norm2.bias | 57.69% | 0.01% | 0.00% | 0.70% | 0.01% | 41.53% | 0.00% | 0.06% |
| unet.down_blocks.1.attentions.1.transformer_blocks.0.norm3.weight | 0.40% | 0.00% | 0.00% | 0.04% | 0.15% | 0.35% | 0.01% | 99.04% |
| unet.down_blocks.1.attentions.1.transformer_blocks.0.norm3.bias | 0.00% | 0.00% | 0.00% | 0.00% | 99.99% | 0.01% | 0.01% | 0.00% |
| unet.down_blocks.1.attentions.1.proj_out.weight | 0.00% | 0.00% | 7.39% | 0.42% | 0.00% | 92.19% | 0.00% | 0.00% |
| unet.down_blocks.1.attentions.1.proj_out.bias | 1.64% | 0.30% | 0.03% | 24.25% | 0.00% | 73.75% | 0.02% | 0.00% |
| unet.down_blocks.1.resnets.0.norm1.weight | 1.30% | 0.00% | 0.00% | 0.00% | 0.00% | 0.04% | 98.65% | 0.00% |
| unet.down_blocks.1.resnets.0.norm1.bias | 35.86% | 0.00% | 0.02% | 15.49% | 0.10% | 31.09% | 17.44% | 0.00% |
| unet.down_blocks.1.resnets.0.conv1.weight | 0.00% | 80.12% | 0.07% | 1.05% | 1.83% | 16.85% | 0.10% | 0.00% |
| unet.down_blocks.1.resnets.0.conv1.bias | 0.00% | 0.58% | 0.00% | 0.00% | 68.33% | 0.00% | 0.12% | 30.96% |
| unet.down_blocks.1.resnets.0.time_emb_proj.weight | 0.00% | 0.05% | 0.00% | 22.41% | 0.00% | 0.00% | 50.73% | 26.80% |
| unet.down_blocks.1.resnets.0.time_emb_proj.bias | 0.00% | 0.00% | 0.03% | 0.00% | 0.00% | 0.09% | 99.88% | 0.00% |
| unet.down_blocks.1.resnets.0.norm2.weight | 0.00% | 0.00% | 0.00% | 68.52% | 5.43% | 12.51% | 13.53% | 0.00% |
| unet.down_blocks.1.resnets.0.norm2.bias | 0.00% | 50.06% | 0.00% | 10.97% | 0.00% | 0.20% | 38.76% | 0.02% |
| unet.down_blocks.1.resnets.0.conv2.weight | 0.06% | 12.26% | 0.00% | 10.41% | 75.84% | 0.00% | 1.42% | 0.00% |
| unet.down_blocks.1.resnets.0.conv2.bias | 0.00% | 0.00% | 0.01% | 0.08% | 77.73% | 21.98% | 0.00% | 0.21% |
| unet.down_blocks.1.resnets.0.conv_shortcut.weight | 26.58% | 0.02% | 0.00% | 61.85% | 0.00% | 1.93% | 0.00% | 9.63% |
| unet.down_blocks.1.resnets.0.conv_shortcut.bias | 0.00% | 0.00% | 0.00% | 38.88% | 0.00% | 38.42% | 22.70% | 0.00% |
| unet.down_blocks.1.resnets.1.norm1.weight | 0.00% | 0.00% | 0.00% | 0.00% | 0.00% | 43.65% | 56.35% | 0.00% |
| unet.down_blocks.1.resnets.1.norm1.bias | 0.00% | 0.00% | 0.00% | 0.08% | 0.00% | 99.92% | 0.00% | 0.00% |
| unet.down_blocks.1.resnets.1.conv1.weight | 62.31% | 0.11% | 0.00% | 0.00% | 0.00% | 0.00% | 0.00% | 37.58% |
| unet.down_blocks.1.resnets.1.conv1.bias | 0.00% | 76.25% | 0.60% | 0.01% | 0.00% | 0.01% | 0.31% | 22.82% |
| unet.down_blocks.1.resnets.1.time_emb_proj.weight | 0.00% | 0.00% | 0.00% | 16.95% | 0.00% | 0.00% | 83.04% | 0.01% |
| unet.down_blocks.1.resnets.1.time_emb_proj.bias | 0.00% | 34.03% | 0.00% | 0.00% | 65.96% | 0.00% | 0.01% | 0.00% |
| unet.down_blocks.1.resnets.1.norm2.weight | 0.04% | 84.97% | 0.00% | 0.00% | 14.68% | 0.30% | 0.00% | 0.00% |
| unet.down_blocks.1.resnets.1.norm2.bias | 0.00% | 41.08% | 0.00% | 0.01% | 21.24% | 0.01% | 37.66% | 0.00% |
| unet.down_blocks.1.resnets.1.conv2.weight | 0.00% | 0.00% | 0.02% | 0.37% | 0.00% | 93.32% | 0.08% | 6.21% |
| unet.down_blocks.1.resnets.1.conv2.bias | 0.73% | 40.19% | 0.00% | 0.00% | 19.03% | 0.04% | 39.93% | 0.08% |
| unet.down_blocks.1.downsamplers.0.conv.weight | 0.00% | 20.56% | 0.00% | 0.01% | 79.30% | 0.12% | 0.00% | 0.00% |
| unet.down_blocks.1.downsamplers.0.conv.bias | 0.77% | 38.54% | 0.00% | 16.24% | 1.31% | 18.17% | 0.58% | 24.38% |
| unet.down_blocks.2.attentions.0.norm.weight | 16.31% | 36.96% | 0.08% | 0.68% | 0.16% | 39.60% | 4.87% | 1.33% |
| unet.down_blocks.2.attentions.0.norm.bias | 82.19% | 0.03% | 0.00% | 0.04% | 0.00% | 17.64% | 0.10% | 0.00% |
| unet.down_blocks.2.attentions.0.proj_in.weight | 0.00% | 0.00% | 0.00% | 58.96% | 0.00% | 37.70% | 3.34% | 0.00% |
| unet.down_blocks.2.attentions.0.proj_in.bias | 0.00% | 0.03% | 0.00% | 0.01% | 99.60% | 0.00% | 0.36% | 0.00% |
| unet.down_blocks.2.attentions.0.transformer_blocks.0.attn1.to_q.weight | 29.75% | 0.71% | 0.00% | 0.00% | 0.00% | 0.00% | 55.18% | 14.37% |
| unet.down_blocks.2.attentions.0.transformer_blocks.0.attn1.to_k.weight | 0.01% | 0.00% | 0.00% | 40.49% | 0.00% | 0.00% | 59.49% | 0.00% |
| unet.down_blocks.2.attentions.0.transformer_blocks.0.attn1.to_v.weight | 0.14% | 0.00% | 0.00% | 0.19% | 0.01% | 34.90% | 64.76% | 0.00% |
| unet.down_blocks.2.attentions.0.transformer_blocks.0.attn1.to_out.0.weight | 12.62% | 0.01% | 0.01% | 37.22% | 0.00% | 0.06% | 23.81% | 26.28% |
| unet.down_blocks.2.attentions.0.transformer_blocks.0.attn1.to_out.0.bias | 0.00% | 0.00% | 0.00% | 98.95% | 0.00% | 0.00% | 1.05% | 0.00% |
| unet.down_blocks.2.attentions.0.transformer_blocks.0.ff.net.0.proj.weight | 0.00% | 97.58% | 0.00% | 0.27% | 0.00% | 0.03% | 0.25% | 1.87% |
| unet.down_blocks.2.attentions.0.transformer_blocks.0.ff.net.0.proj.bias | 99.88% | 0.02% | 0.01% | 0.02% | 0.00% | 0.01% | 0.05% | 0.00% |
| unet.down_blocks.2.attentions.0.transformer_blocks.0.ff.net.2.weight | 0.00% | 1.75% | 0.63% | 0.00% | 0.19% | 94.89% | 2.54% | 0.00% |
| unet.down_blocks.2.attentions.0.transformer_blocks.0.ff.net.2.bias | 0.00% | 2.08% | 0.00% | 0.07% | 0.01% | 15.06% | 82.78% | 0.00% |
| unet.down_blocks.2.attentions.0.transformer_blocks.0.attn2.to_q.weight | 0.00% | 62.81% | 23.43% | 0.03% | 0.00% | 13.73% | 0.00% | 0.00% |
| unet.down_blocks.2.attentions.0.transformer_blocks.0.attn2.to_k.weight | 0.00% | 0.00% | 0.04% | 99.92% | 0.01% | 0.00% | 0.00% | 0.02% |
| unet.down_blocks.2.attentions.0.transformer_blocks.0.attn2.to_v.weight | 0.29% | 0.02% | 0.00% | 4.51% | 9.23% | 0.06% | 85.89% | 0.00% |
| unet.down_blocks.2.attentions.0.transformer_blocks.0.attn2.to_out.0.weight | 0.00% | 0.31% | 10.06% | 0.02% | 0.00% | 1.41% | 88.15% | 0.06% |
| unet.down_blocks.2.attentions.0.transformer_blocks.0.attn2.to_out.0.bias | 1.10% | 0.00% | 0.00% | 6.60% | 3.45% | 0.18% | 88.67% | 0.00% |
| unet.down_blocks.2.attentions.0.transformer_blocks.0.norm1.weight | 0.00% | 17.70% | 0.00% | 32.63% | 0.42% | 47.71% | 1.53% | 0.02% |
| unet.down_blocks.2.attentions.0.transformer_blocks.0.norm1.bias | 0.00% | 0.00% | 0.00% | 0.00% | 0.00% | 47.19% | 52.81% | 0.00% |
| unet.down_blocks.2.attentions.0.transformer_blocks.0.norm2.weight | 0.00% | 0.00% | 0.00% | 3.82% | 0.35% | 0.30% | 95.53% | 0.00% |
| unet.down_blocks.2.attentions.0.transformer_blocks.0.norm2.bias | 0.11% | 0.00% | 0.06% | 0.00% | 0.01% | 0.08% | 98.71% | 1.03% |
| unet.down_blocks.2.attentions.0.transformer_blocks.0.norm3.weight | 0.00% | 0.37% | 0.00% | 0.00% | 99.18% | 0.00% | 0.45% | 0.00% |
| unet.down_blocks.2.attentions.0.transformer_blocks.0.norm3.bias | 0.00% | 0.00% | 5.01% | 1.77% | 53.88% | 11.90% | 27.44% | 0.00% |
| unet.down_blocks.2.attentions.0.proj_out.weight | 56.08% | 0.11% | 0.00% | 15.94% | 3.41% | 24.45% | 0.00% | 0.00% |
| unet.down_blocks.2.attentions.0.proj_out.bias | 0.00% | 0.00% | 0.00% | 98.70% | 0.01% | 1.26% | 0.03% | 0.00% |
| unet.down_blocks.2.attentions.1.norm.weight | 43.20% | 18.68% | 0.00% | 37.97% | 0.00% | 0.07% | 0.07% | 0.00% |
| unet.down_blocks.2.attentions.1.norm.bias | 0.00% | 27.22% | 0.00% | 0.00% | 61.68% | 11.10% | 0.00% | 0.00% |
| unet.down_blocks.2.attentions.1.proj_in.weight | 23.14% | 0.00% | 0.00% | 0.04% | 13.34% | 19.33% | 44.14% | 0.00% |
| unet.down_blocks.2.attentions.1.proj_in.bias | 0.01% | 0.00% | 0.00% | 99.76% | 0.00% | 0.13% | 0.01% | 0.09% |
| unet.down_blocks.2.attentions.1.transformer_blocks.0.attn1.to_q.weight | 0.00% | 12.67% | 0.00% | 0.01% | 87.28% | 0.00% | 0.05% | 0.00% |
| unet.down_blocks.2.attentions.1.transformer_blocks.0.attn1.to_k.weight | 0.00% | 0.25% | 0.01% | 0.12% | 0.00% | 53.15% | 46.28% | 0.20% |
| unet.down_blocks.2.attentions.1.transformer_blocks.0.attn1.to_v.weight | 0.00% | 7.79% | 0.00% | 2.64% | 89.53% | 0.00% | 0.04% | 0.00% |
| unet.down_blocks.2.attentions.1.transformer_blocks.0.attn1.to_out.0.weight | 0.46% | 26.01% | 0.01% | 0.17% | 0.00% | 0.00% | 73.34% | 0.00% |
| unet.down_blocks.2.attentions.1.transformer_blocks.0.attn1.to_out.0.bias | 0.00% | 0.00% | 0.00% | 57.25% | 0.06% | 42.69% | 0.00% | 0.00% |
| unet.down_blocks.2.attentions.1.transformer_blocks.0.ff.net.0.proj.weight | 0.00% | 0.03% | 0.00% | 99.62% | 0.10% | 0.24% | 0.00% | 0.00% |
| unet.down_blocks.2.attentions.1.transformer_blocks.0.ff.net.0.proj.bias | 0.00% | 0.05% | 25.20% | 19.61% | 0.07% | 0.70% | 54.38% | 0.00% |
| unet.down_blocks.2.attentions.1.transformer_blocks.0.ff.net.2.weight | 0.00% | 0.93% | 0.00% | 17.54% | 0.03% | 43.36% | 38.13% | 0.00% |
| unet.down_blocks.2.attentions.1.transformer_blocks.0.ff.net.2.bias | 0.00% | 0.00% | 0.16% | 85.35% | 3.89% | 0.00% | 10.37% | 0.24% |
| unet.down_blocks.2.attentions.1.transformer_blocks.0.attn2.to_q.weight | 0.00% | 0.10% | 0.01% | 0.01% | 0.00% | 0.00% | 99.87% | 0.02% |
| unet.down_blocks.2.attentions.1.transformer_blocks.0.attn2.to_k.weight | 0.02% | 40.32% | 0.00% | 30.69% | 0.00% | 28.51% | 0.00% | 0.46% |
| unet.down_blocks.2.attentions.1.transformer_blocks.0.attn2.to_v.weight | 0.00% | 0.00% | 0.00% | 0.00% | 0.00% | 0.05% | 19.42% | 80.52% |
| unet.down_blocks.2.attentions.1.transformer_blocks.0.attn2.to_out.0.weight | 0.00% | 0.37% | 0.00% | 16.59% | 0.00% | 0.00% | 3.59% | 79.45% |
| unet.down_blocks.2.attentions.1.transformer_blocks.0.attn2.to_out.0.bias | 0.00% | 0.00% | 0.03% | 0.00% | 0.00% | 0.00% | 99.54% | 0.43% |
| unet.down_blocks.2.attentions.1.transformer_blocks.0.norm1.weight | 0.00% | 0.00% | 0.00% | 0.00% | 56.12% | 43.87% | 0.00% | 0.00% |
| unet.down_blocks.2.attentions.1.transformer_blocks.0.norm1.bias | 0.00% | 0.00% | 0.09% | 0.02% | 84.75% | 0.00% | 0.00% | 15.14% |
| unet.down_blocks.2.attentions.1.transformer_blocks.0.norm2.weight | 0.00% | 0.00% | 0.01% | 0.00% | 0.00% | 0.22% | 0.31% | 99.46% |
| unet.down_blocks.2.attentions.1.transformer_blocks.0.norm2.bias | 0.00% | 0.91% | 0.00% | 0.00% | 0.00% | 96.10% | 2.99% | 0.00% |
| unet.down_blocks.2.attentions.1.transformer_blocks.0.norm3.weight | 0.04% | 79.24% | 16.99% | 1.37% | 0.01% | 2.35% | 0.00% | 0.00% |
| unet.down_blocks.2.attentions.1.transformer_blocks.0.norm3.bias | 0.00% | 0.10% | 0.00% | 0.01% | 2.27% | 0.05% | 0.01% | 97.56% |
| unet.down_blocks.2.attentions.1.proj_out.weight | 72.83% | 7.86% | 0.00% | 10.14% | 0.00% | 0.00% | 9.17% | 0.00% |
| unet.down_blocks.2.attentions.1.proj_out.bias | 0.00% | 2.24% | 41.11% | 16.76% | 0.00% | 1.57% | 0.76% | 37.55% |
| unet.down_blocks.2.resnets.0.norm1.weight | 0.00% | 10.57% | 0.01% | 0.00% | 0.00% | 3.74% | 85.67% | 0.00% |
| unet.down_blocks.2.resnets.0.norm1.bias | 0.01% | 31.97% | 0.00% | 0.00% | 0.00% | 66.33% | 1.70% | 0.00% |
| unet.down_blocks.2.resnets.0.conv1.weight | 0.00% | 10.49% | 0.00% | 0.02% | 0.00% | 10.13% | 0.00% | 79.36% |
| unet.down_blocks.2.resnets.0.conv1.bias | 0.00% | 6.02% | 0.00% | 0.56% | 0.00% | 93.42% | 0.00% | 0.00% |
| unet.down_blocks.2.resnets.0.time_emb_proj.weight | 0.00% | 0.02% | 64.78% | 35.14% | 0.00% | 0.06% | 0.00% | 0.00% |
| unet.down_blocks.2.resnets.0.time_emb_proj.bias | 0.00% | 50.26% | 0.00% | 0.00% | 10.48% | 4.15% | 26.48% | 8.64% |
| unet.down_blocks.2.resnets.0.norm2.weight | 19.77% | 28.78% | 0.00% | 0.00% | 18.93% | 0.00% | 32.40% | 0.11% |
| unet.down_blocks.2.resnets.0.norm2.bias | 3.80% | 0.00% | 0.00% | 0.00% | 0.00% | 0.00% | 0.04% | 96.16% |
| unet.down_blocks.2.resnets.0.conv2.weight | 0.00% | 32.43% | 0.00% | 0.24% | 0.37% | 0.16% | 66.79% | 0.01% |
| unet.down_blocks.2.resnets.0.conv2.bias | 11.74% | 0.60% | 42.39% | 0.01% | 0.01% | 43.34% | 0.89% | 1.03% |
| unet.down_blocks.2.resnets.0.conv_shortcut.weight | 0.01% | 0.03% | 0.00% | 62.62% | 0.01% | 37.30% | 0.00% | 0.03% |
| unet.down_blocks.2.resnets.0.conv_shortcut.bias | 78.89% | 0.00% | 0.00% | 0.00% | 21.10% | 0.01% | 0.00% | 0.00% |
| unet.down_blocks.2.resnets.1.norm1.weight | 0.00% | 1.23% | 0.00% | 0.01% | 0.00% | 0.00% | 98.75% | 0.00% |
| unet.down_blocks.2.resnets.1.norm1.bias | 0.00% | 0.13% | 0.00% | 0.05% | 0.01% | 99.81% | 0.00% | 0.00% |
| unet.down_blocks.2.resnets.1.conv1.weight | 0.00% | 0.00% | 0.00% | 17.19% | 0.12% | 40.20% | 0.00% | 42.49% |
| unet.down_blocks.2.resnets.1.conv1.bias | 0.00% | 99.40% | 0.55% | 0.04% | 0.01% | 0.00% | 0.00% | 0.00% |
| unet.down_blocks.2.resnets.1.time_emb_proj.weight | 0.30% | 0.03% | 0.22% | 2.44% | 0.00% | 74.21% | 0.00% | 22.80% |
| unet.down_blocks.2.resnets.1.time_emb_proj.bias | 0.00% | 0.00% | 46.27% | 1.00% | 5.65% | 0.00% | 47.07% | 0.00% |
| unet.down_blocks.2.resnets.1.norm2.weight | 0.09% | 0.00% | 0.00% | 99.78% | 0.04% | 0.09% | 0.00% | 0.00% |
| unet.down_blocks.2.resnets.1.norm2.bias | 0.00% | 3.28% | 0.00% | 0.05% | 45.67% | 6.60% | 44.41% | 0.00% |
| unet.down_blocks.2.resnets.1.conv2.weight | 12.60% | 0.07% | 0.02% | 2.88% | 84.02% | 0.00% | 0.41% | 0.00% |
| unet.down_blocks.2.resnets.1.conv2.bias | 99.93% | 0.00% | 0.00% | 0.00% | 0.00% | 0.06% | 0.00% | 0.00% |
| unet.down_blocks.2.downsamplers.0.conv.weight | 0.00% | 0.00% | 1.87% | 5.33% | 0.00% | 33.78% | 59.02% | 0.00% |
| unet.down_blocks.2.downsamplers.0.conv.bias | 0.00% | 0.00% | 0.00% | 0.06% | 0.00% | 0.00% | 99.94% | 0.00% |
| unet.down_blocks.3.resnets.0.norm1.weight | 0.00% | 0.00% | 0.03% | 66.52% | 4.95% | 0.00% | 28.50% | 0.00% |
| unet.down_blocks.3.resnets.0.norm1.bias | 0.27% | 7.91% | 0.00% | 78.65% | 0.00% | 7.53% | 5.64% | 0.00% |
| unet.down_blocks.3.resnets.0.conv1.weight | 40.72% | 0.00% | 0.00% | 56.41% | 0.00% | 0.00% | 0.69% | 2.18% |
| unet.down_blocks.3.resnets.0.conv1.bias | 0.00% | 0.03% | 27.51% | 0.00% | 0.00% | 0.00% | 72.46% | 0.00% |
| unet.down_blocks.3.resnets.0.time_emb_proj.weight | 9.45% | 3.81% | 0.00% | 0.00% | 8.57% | 0.01% | 78.14% | 0.00% |
| unet.down_blocks.3.resnets.0.time_emb_proj.bias | 1.96% | 3.73% | 0.00% | 89.13% | 5.14% | 0.00% | 0.02% | 0.00% |
| unet.down_blocks.3.resnets.0.norm2.weight | 0.00% | 1.00% | 0.00% | 0.00% | 0.00% | 0.03% | 0.00% | 98.97% |
| unet.down_blocks.3.resnets.0.norm2.bias | 8.33% | 70.16% | 0.00% | 0.01% | 3.68% | 16.78% | 1.04% | 0.01% |
| unet.down_blocks.3.resnets.0.conv2.weight | 10.19% | 0.00% | 0.22% | 73.17% | 0.04% | 0.00% | 16.37% | 0.01% |
| unet.down_blocks.3.resnets.0.conv2.bias | 82.59% | 0.04% | 0.02% | 14.39% | 0.02% | 2.94% | 0.00% | 0.00% |
| unet.down_blocks.3.resnets.1.norm1.weight | 0.00% | 58.61% | 0.00% | 0.00% | 33.59% | 7.75% | 0.04% | 0.00% |
| unet.down_blocks.3.resnets.1.norm1.bias | 11.29% | 0.94% | 0.01% | 0.26% | 83.25% | 3.14% | 1.11% | 0.01% |
| unet.down_blocks.3.resnets.1.conv1.weight | 0.03% | 3.76% | 0.00% | 0.91% | 23.04% | 6.35% | 0.38% | 65.54% |
| unet.down_blocks.3.resnets.1.conv1.bias | 0.03% | 0.01% | 0.00% | 8.24% | 0.00% | 2.50% | 89.23% | 0.00% |
| unet.down_blocks.3.resnets.1.time_emb_proj.weight | 0.00% | 0.00% | 0.00% | 99.56% | 0.00% | 0.03% | 0.40% | 0.00% |
| unet.down_blocks.3.resnets.1.time_emb_proj.bias | 0.00% | 0.00% | 0.00% | 0.00% | 0.04% | 0.10% | 99.86% | 0.00% |
| unet.down_blocks.3.resnets.1.norm2.weight | 4.75% | 24.30% | 0.00% | 0.36% | 22.49% | 47.97% | 0.12% | 0.00% |
| unet.down_blocks.3.resnets.1.norm2.bias | 0.01% | 5.83% | 0.00% | 78.76% | 11.84% | 0.00% | 1.38% | 2.17% |
| unet.down_blocks.3.resnets.1.conv2.weight | 91.23% | 0.00% | 0.02% | 0.00% | 0.00% | 8.49% | 0.27% | 0.00% |
| unet.down_blocks.3.resnets.1.conv2.bias | 0.00% | 0.01% | 0.00% | 83.94% | 0.02% | 0.00% | 16.03% | 0.00% |
| unet.up_blocks.0.resnets.0.norm1.weight | 0.00% | 30.71% | 0.03% | 68.19% | 0.00% | 0.95% | 0.12% | 0.00% |
| unet.up_blocks.0.resnets.0.norm1.bias | 0.00% | 0.00% | 0.00% | 0.30% | 0.08% | 0.00% | 99.61% | 0.00% |
| unet.up_blocks.0.resnets.0.conv1.weight | 0.00% | 62.37% | 0.00% | 37.58% | 0.00% | 0.00% | 0.04% | 0.00% |
| unet.up_blocks.0.resnets.0.conv1.bias | 1.09% | 0.01% | 5.92% | 0.02% | 0.00% | 23.94% | 69.02% | 0.00% |
| unet.up_blocks.0.resnets.0.time_emb_proj.weight | 0.00% | 0.32% | 0.02% | 60.18% | 0.01% | 38.73% | 0.73% | 0.01% |
| unet.up_blocks.0.resnets.0.time_emb_proj.bias | 0.00% | 0.00% | 0.00% | 99.99% | 0.00% | 0.00% | 0.00% | 0.01% |
| unet.up_blocks.0.resnets.0.norm2.weight | 0.03% | 0.00% | 0.00% | 0.01% | 0.00% | 0.54% | 99.43% | 0.00% |
| unet.up_blocks.0.resnets.0.norm2.bias | 0.01% | 29.48% | 0.00% | 1.10% | 18.18% | 51.20% | 0.03% | 0.00% |
| unet.up_blocks.0.resnets.0.conv2.weight | 0.01% | 0.86% | 0.00% | 9.45% | 0.90% | 85.03% | 3.75% | 0.00% |
| unet.up_blocks.0.resnets.0.conv2.bias | 0.00% | 0.07% | 0.00% | 98.30% | 1.63% | 0.00% | 0.00% | 0.00% |
| unet.up_blocks.0.resnets.0.conv_shortcut.weight | 3.30% | 10.72% | 0.00% | 11.47% | 0.03% | 0.00% | 74.48% | 0.00% |
| unet.up_blocks.0.resnets.0.conv_shortcut.bias | 0.00% | 0.01% | 0.00% | 99.93% | 0.03% | 0.02% | 0.00% | 0.00% |
| unet.up_blocks.0.resnets.1.norm1.weight | 0.04% | 0.51% | 0.00% | 29.37% | 0.37% | 68.35% | 0.00% | 1.36% |
| unet.up_blocks.0.resnets.1.norm1.bias | 0.01% | 0.00% | 0.00% | 0.00% | 0.00% | 0.78% | 0.59% | 98.62% |
| unet.up_blocks.0.resnets.1.conv1.weight | 0.00% | 0.00% | 0.00% | 0.08% | 41.47% | 0.01% | 58.44% | 0.00% |
| unet.up_blocks.0.resnets.1.conv1.bias | 0.00% | 31.03% | 0.00% | 0.01% | 0.00% | 29.78% | 0.00% | 39.19% |
| unet.up_blocks.0.resnets.1.time_emb_proj.weight | 58.31% | 3.85% | 0.00% | 0.59% | 37.14% | 0.00% | 0.10% | 0.02% |
| unet.up_blocks.0.resnets.1.time_emb_proj.bias | 0.00% | 25.78% | 0.00% | 0.00% | 73.49% | 0.51% | 0.00% | 0.22% |
| unet.up_blocks.0.resnets.1.norm2.weight | 0.38% | 0.00% | 1.45% | 33.17% | 34.34% | 0.14% | 30.52% | 0.00% |
| unet.up_blocks.0.resnets.1.norm2.bias | 0.00% | 14.70% | 0.00% | 0.00% | 10.65% | 0.06% | 74.58% | 0.00% |
| unet.up_blocks.0.resnets.1.conv2.weight | 0.00% | 1.70% | 3.76% | 0.09% | 0.01% | 0.00% | 0.00% | 94.44% |
| unet.up_blocks.0.resnets.1.conv2.bias | 3.53% | 0.09% | 0.00% | 0.06% | 5.57% | 89.40% | 1.35% | 0.00% |
| unet.up_blocks.0.resnets.1.conv_shortcut.weight | 0.09% | 94.35% | 0.07% | 0.01% | 0.00% | 0.78% | 4.61% | 0.09% |
| unet.up_blocks.0.resnets.1.conv_shortcut.bias | 0.00% | 0.01% | 0.00% | 12.12% | 0.00% | 0.00% | 87.87% | 0.00% |
| unet.up_blocks.0.resnets.2.norm1.weight | 0.00% | 0.00% | 0.00% | 0.03% | 5.48% | 2.07% | 0.00% | 92.42% |
| unet.up_blocks.0.resnets.2.norm1.bias | 0.00% | 0.01% | 0.12% | 7.20% | 0.00% | 0.00% | 91.17% | 1.49% |
| unet.up_blocks.0.resnets.2.conv1.weight | 0.00% | 0.62% | 0.00% | 1.06% | 79.82% | 4.70% | 0.00% | 13.81% |
| unet.up_blocks.0.resnets.2.conv1.bias | 0.00% | 19.74% | 0.00% | 0.50% | 0.00% | 76.89% | 2.86% | 0.01% |
| unet.up_blocks.0.resnets.2.time_emb_proj.weight | 0.00% | 0.01% | 0.00% | 96.78% | 0.52% | 0.03% | 1.08% | 1.58% |
| unet.up_blocks.0.resnets.2.time_emb_proj.bias | 0.00% | 0.00% | 30.26% | 0.47% | 0.00% | 28.08% | 41.19% | 0.00% |
| unet.up_blocks.0.resnets.2.norm2.weight | 0.39% | 12.09% | 0.00% | 0.00% | 0.00% | 87.19% | 0.32% | 0.00% |
| unet.up_blocks.0.resnets.2.norm2.bias | 72.08% | 0.00% | 0.02% | 27.65% | 0.25% | 0.00% | 0.00% | 0.00% |
| unet.up_blocks.0.resnets.2.conv2.weight | 26.90% | 0.00% | 0.00% | 0.22% | 0.00% | 0.13% | 72.43% | 0.32% |
| unet.up_blocks.0.resnets.2.conv2.bias | 53.43% | 0.00% | 0.00% | 0.04% | 0.01% | 4.38% | 42.14% | 0.00% |
| unet.up_blocks.0.resnets.2.conv_shortcut.weight | 0.00% | 0.04% | 0.00% | 49.78% | 1.75% | 3.31% | 45.12% | 0.00% |
| unet.up_blocks.0.resnets.2.conv_shortcut.bias | 0.49% | 97.35% | 0.00% | 0.00% | 0.00% | 2.15% | 0.00% | 0.00% |
| unet.up_blocks.0.upsamplers.0.conv.weight | 0.02% | 0.00% | 0.00% | 10.59% | 2.25% | 0.00% | 87.09% | 0.05% |
| unet.up_blocks.0.upsamplers.0.conv.bias | 0.00% | 13.79% | 0.00% | 81.67% | 0.02% | 0.03% | 4.50% | 0.00% |
| unet.up_blocks.1.attentions.0.norm.weight | 0.00% | 2.25% | 0.00% | 5.33% | 0.00% | 92.42% | 0.00% | 0.00% |
| unet.up_blocks.1.attentions.0.norm.bias | 0.00% | 0.00% | 0.00% | 0.44% | 0.05% | 0.14% | 99.36% | 0.00% |
| unet.up_blocks.1.attentions.0.proj_in.weight | 0.00% | 0.03% | 0.00% | 38.97% | 15.84% | 1.79% | 43.37% | 0.00% |
| unet.up_blocks.1.attentions.0.proj_in.bias | 0.00% | 0.18% | 0.00% | 42.69% | 0.00% | 0.00% | 57.12% | 0.00% |
| unet.up_blocks.1.attentions.0.transformer_blocks.0.attn1.to_q.weight | 0.57% | 37.31% | 0.00% | 0.01% | 0.00% | 0.00% | 62.10% | 0.00% |
| unet.up_blocks.1.attentions.0.transformer_blocks.0.attn1.to_k.weight | 0.01% | 43.62% | 0.00% | 0.59% | 1.75% | 0.00% | 54.03% | 0.00% |
| unet.up_blocks.1.attentions.0.transformer_blocks.0.attn1.to_v.weight | 0.00% | 0.27% | 0.02% | 5.70% | 0.06% | 82.61% | 11.33% | 0.00% |
| unet.up_blocks.1.attentions.0.transformer_blocks.0.attn1.to_out.0.weight | 0.00% | 5.59% | 0.17% | 54.84% | 0.00% | 39.40% | 0.00% | 0.00% |
| unet.up_blocks.1.attentions.0.transformer_blocks.0.attn1.to_out.0.bias | 0.13% | 0.01% | 0.00% | 0.03% | 63.31% | 36.48% | 0.00% | 0.03% |
| unet.up_blocks.1.attentions.0.transformer_blocks.0.ff.net.0.proj.weight | 21.17% | 0.22% | 1.45% | 0.00% | 0.04% | 2.86% | 74.07% | 0.19% |
| unet.up_blocks.1.attentions.0.transformer_blocks.0.ff.net.0.proj.bias | 0.00% | 0.04% | 0.04% | 0.13% | 0.00% | 98.70% | 1.09% | 0.00% |
| unet.up_blocks.1.attentions.0.transformer_blocks.0.ff.net.2.weight | 0.00% | 0.00% | 0.00% | 14.20% | 0.88% | 0.03% | 84.90% | 0.00% |
| unet.up_blocks.1.attentions.0.transformer_blocks.0.ff.net.2.bias | 0.00% | 2.15% | 0.00% | 93.41% | 0.04% | 2.92% | 1.48% | 0.00% |
| unet.up_blocks.1.attentions.0.transformer_blocks.0.attn2.to_q.weight | 0.00% | 0.00% | 0.00% | 0.00% | 75.15% | 0.04% | 14.21% | 10.60% |
| unet.up_blocks.1.attentions.0.transformer_blocks.0.attn2.to_k.weight | 0.00% | 1.81% | 0.00% | 98.17% | 0.00% | 0.02% | 0.00% | 0.00% |
| unet.up_blocks.1.attentions.0.transformer_blocks.0.attn2.to_v.weight | 0.01% | 0.00% | 0.00% | 36.05% | 0.00% | 48.98% | 14.96% | 0.00% |
| unet.up_blocks.1.attentions.0.transformer_blocks.0.attn2.to_out.0.weight | 78.55% | 0.26% | 0.00% | 2.57% | 11.64% | 0.00% | 0.00% | 6.98% |
| unet.up_blocks.1.attentions.0.transformer_blocks.0.attn2.to_out.0.bias | 45.77% | 0.00% | 0.00% | 0.01% | 0.00% | 54.22% | 0.00% | 0.00% |
| unet.up_blocks.1.attentions.0.transformer_blocks.0.norm1.weight | 0.00% | 0.17% | 0.00% | 2.54% | 0.00% | 50.60% | 46.69% | 0.00% |
| unet.up_blocks.1.attentions.0.transformer_blocks.0.norm1.bias | 93.05% | 0.00% | 0.00% | 5.74% | 0.02% | 0.01% | 1.18% | 0.00% |
| unet.up_blocks.1.attentions.0.transformer_blocks.0.norm2.weight | 0.00% | 0.02% | 0.00% | 0.00% | 0.01% | 10.69% | 89.28% | 0.00% |
| unet.up_blocks.1.attentions.0.transformer_blocks.0.norm2.bias | 0.00% | 0.00% | 5.91% | 23.96% | 0.00% | 0.00% | 70.12% | 0.00% |
| unet.up_blocks.1.attentions.0.transformer_blocks.0.norm3.weight | 0.01% | 0.00% | 0.02% | 0.00% | 0.10% | 0.12% | 99.74% | 0.00% |
| unet.up_blocks.1.attentions.0.transformer_blocks.0.norm3.bias | 0.00% | 0.02% | 96.61% | 0.00% | 0.00% | 0.13% | 3.24% | 0.00% |
| unet.up_blocks.1.attentions.0.proj_out.weight | 2.21% | 0.00% | 0.00% | 95.72% | 0.00% | 2.07% | 0.00% | 0.00% |
| unet.up_blocks.1.attentions.0.proj_out.bias | 0.00% | 0.00% | 0.84% | 0.00% | 0.85% | 27.07% | 71.23% | 0.00% |
| unet.up_blocks.1.attentions.1.norm.weight | 0.04% | 0.07% | 0.00% | 0.00% | 0.00% | 64.67% | 35.21% | 0.00% |
| unet.up_blocks.1.attentions.1.norm.bias | 99.31% | 0.13% | 0.00% | 0.00% | 0.03% | 0.01% | 0.06% | 0.46% |
| unet.up_blocks.1.attentions.1.proj_in.weight | 0.00% | 0.00% | 0.00% | 0.04% | 0.00% | 54.06% | 45.88% | 0.01% |
| unet.up_blocks.1.attentions.1.proj_in.bias | 0.00% | 0.00% | 0.00% | 13.26% | 84.08% | 2.63% | 0.04% | 0.00% |
| unet.up_blocks.1.attentions.1.transformer_blocks.0.attn1.to_q.weight | 0.00% | 0.00% | 0.00% | 0.03% | 0.13% | 1.63% | 98.18% | 0.02% |
| unet.up_blocks.1.attentions.1.transformer_blocks.0.attn1.to_k.weight | 0.58% | 22.90% | 0.09% | 37.55% | 20.62% | 0.39% | 17.74% | 0.13% |
| unet.up_blocks.1.attentions.1.transformer_blocks.0.attn1.to_v.weight | 0.17% | 1.52% | 0.00% | 25.55% | 0.00% | 0.43% | 72.27% | 0.06% |
| unet.up_blocks.1.attentions.1.transformer_blocks.0.attn1.to_out.0.weight | 0.00% | 17.17% | 0.14% | 82.61% | 0.05% | 0.00% | 0.02% | 0.00% |
| unet.up_blocks.1.attentions.1.transformer_blocks.0.attn1.to_out.0.bias | 4.69% | 0.00% | 0.02% | 8.94% | 36.75% | 0.00% | 47.83% | 1.77% |
| unet.up_blocks.1.attentions.1.transformer_blocks.0.ff.net.0.proj.weight | 0.00% | 0.00% | 0.02% | 0.02% | 98.60% | 1.35% | 0.01% | 0.00% |
| unet.up_blocks.1.attentions.1.transformer_blocks.0.ff.net.0.proj.bias | 0.00% | 34.56% | 0.00% | 64.83% | 0.00% | 0.61% | 0.00% | 0.00% |
| unet.up_blocks.1.attentions.1.transformer_blocks.0.ff.net.2.weight | 0.00% | 0.00% | 0.00% | 0.02% | 99.35% | 0.00% | 0.64% | 0.00% |
| unet.up_blocks.1.attentions.1.transformer_blocks.0.ff.net.2.bias | 0.42% | 1.49% | 0.00% | 0.00% | 0.00% | 98.04% | 0.00% | 0.05% |
| unet.up_blocks.1.attentions.1.transformer_blocks.0.attn2.to_q.weight | 1.23% | 0.00% | 0.00% | 0.00% | 82.55% | 15.90% | 0.06% | 0.26% |
| unet.up_blocks.1.attentions.1.transformer_blocks.0.attn2.to_k.weight | 0.00% | 0.00% | 0.00% | 0.48% | 0.30% | 0.47% | 97.98% | 0.77% |
| unet.up_blocks.1.attentions.1.transformer_blocks.0.attn2.to_v.weight | 0.00% | 21.60% | 0.00% | 15.46% | 0.04% | 0.29% | 0.00% | 62.60% |
| unet.up_blocks.1.attentions.1.transformer_blocks.0.attn2.to_out.0.weight | 0.00% | 60.36% | 0.00% | 35.40% | 0.00% | 4.07% | 0.12% | 0.05% |
| unet.up_blocks.1.attentions.1.transformer_blocks.0.attn2.to_out.0.bias | 0.00% | 0.00% | 86.56% | 0.00% | 0.31% | 0.06% | 13.01% | 0.06% |
| unet.up_blocks.1.attentions.1.transformer_blocks.0.norm1.weight | 0.00% | 8.44% | 0.00% | 0.00% | 25.76% | 0.00% | 65.80% | 0.00% |
| unet.up_blocks.1.attentions.1.transformer_blocks.0.norm1.bias | 58.07% | 0.00% | 0.00% | 26.61% | 5.80% | 0.00% | 9.47% | 0.05% |
| unet.up_blocks.1.attentions.1.transformer_blocks.0.norm2.weight | 0.00% | 8.72% | 86.03% | 0.22% | 0.16% | 0.01% | 0.07% | 4.78% |
| unet.up_blocks.1.attentions.1.transformer_blocks.0.norm2.bias | 0.00% | 0.27% | 0.01% | 43.52% | 0.00% | 51.72% | 4.47% | 0.00% |
| unet.up_blocks.1.attentions.1.transformer_blocks.0.norm3.weight | 0.00% | 7.81% | 0.00% | 0.00% | 76.45% | 5.04% | 10.67% | 0.02% |
| unet.up_blocks.1.attentions.1.transformer_blocks.0.norm3.bias | 0.04% | 78.72% | 0.00% | 11.96% | 0.68% | 0.00% | 0.77% | 7.84% |
| unet.up_blocks.1.attentions.1.proj_out.weight | 0.01% | 11.59% | 0.80% | 0.00% | 87.45% | 0.14% | 0.00% | 0.00% |
| unet.up_blocks.1.attentions.1.proj_out.bias | 94.14% | 0.05% | 0.00% | 0.14% | 0.00% | 0.55% | 0.10% | 5.01% |
| unet.up_blocks.1.attentions.2.norm.weight | 59.81% | 0.00% | 0.00% | 0.00% | 4.57% | 1.82% | 33.80% | 0.00% |
| unet.up_blocks.1.attentions.2.norm.bias | 0.03% | 0.00% | 0.00% | 0.00% | 57.89% | 31.53% | 0.19% | 10.36% |
| unet.up_blocks.1.attentions.2.proj_in.weight | 0.00% | 0.00% | 15.87% | 64.15% | 0.04% | 0.00% | 1.73% | 18.21% |
| unet.up_blocks.1.attentions.2.proj_in.bias | 0.00% | 0.00% | 22.17% | 76.75% | 0.01% | 0.00% | 1.06% | 0.00% |
| unet.up_blocks.1.attentions.2.transformer_blocks.0.attn1.to_q.weight | 0.28% | 0.00% | 0.01% | 80.43% | 0.00% | 0.01% | 19.26% | 0.00% |
| unet.up_blocks.1.attentions.2.transformer_blocks.0.attn1.to_k.weight | 0.04% | 86.83% | 0.00% | 12.87% | 0.17% | 0.00% | 0.06% | 0.02% |
| unet.up_blocks.1.attentions.2.transformer_blocks.0.attn1.to_v.weight | 0.00% | 0.00% | 0.00% | 2.42% | 0.00% | 0.00% | 0.00% | 97.57% |
| unet.up_blocks.1.attentions.2.transformer_blocks.0.attn1.to_out.0.weight | 0.00% | 0.00% | 0.00% | 0.21% | 0.00% | 0.03% | 0.00% | 99.76% |
| unet.up_blocks.1.attentions.2.transformer_blocks.0.attn1.to_out.0.bias | 0.02% | 0.00% | 0.01% | 0.29% | 0.72% | 31.71% | 66.95% | 0.29% |
| unet.up_blocks.1.attentions.2.transformer_blocks.0.ff.net.0.proj.weight | 0.00% | 0.00% | 0.00% | 0.00% | 69.13% | 15.41% | 15.46% | 0.00% |
| unet.up_blocks.1.attentions.2.transformer_blocks.0.ff.net.0.proj.bias | 0.02% | 0.20% | 0.57% | 0.00% | 2.20% | 0.03% | 0.00% | 96.99% |
| unet.up_blocks.1.attentions.2.transformer_blocks.0.ff.net.2.weight | 0.00% | 1.00% | 0.00% | 3.34% | 0.14% | 1.99% | 93.52% | 0.02% |
| unet.up_blocks.1.attentions.2.transformer_blocks.0.ff.net.2.bias | 0.00% | 0.41% | 0.00% | 88.40% | 0.00% | 11.17% | 0.01% | 0.00% |
| unet.up_blocks.1.attentions.2.transformer_blocks.0.attn2.to_q.weight | 0.00% | 95.87% | 0.00% | 0.02% | 0.00% | 4.04% | 0.00% | 0.08% |
| unet.up_blocks.1.attentions.2.transformer_blocks.0.attn2.to_k.weight | 28.11% | 17.46% | 0.00% | 0.15% | 11.06% | 1.08% | 42.05% | 0.09% |
| unet.up_blocks.1.attentions.2.transformer_blocks.0.attn2.to_v.weight | 0.00% | 0.00% | 0.00% | 3.12% | 0.00% | 0.00% | 0.00% | 96.88% |
| unet.up_blocks.1.attentions.2.transformer_blocks.0.attn2.to_out.0.weight | 0.00% | 0.05% | 0.35% | 0.53% | 0.00% | 99.06% | 0.01% | 0.00% |
| unet.up_blocks.1.attentions.2.transformer_blocks.0.attn2.to_out.0.bias | 0.26% | 0.24% | 0.27% | 96.34% | 0.00% | 0.00% | 2.87% | 0.02% |
| unet.up_blocks.1.attentions.2.transformer_blocks.0.norm1.weight | 4.96% | 63.92% | 0.00% | 2.94% | 6.27% | 0.05% | 21.86% | 0.00% |
| unet.up_blocks.1.attentions.2.transformer_blocks.0.norm1.bias | 0.00% | 0.00% | 0.00% | 0.00% | 0.00% | 7.12% | 1.91% | 90.97% |
| unet.up_blocks.1.attentions.2.transformer_blocks.0.norm2.weight | 0.00% | 0.00% | 0.00% | 0.56% | 98.99% | 0.00% | 0.44% | 0.00% |
| unet.up_blocks.1.attentions.2.transformer_blocks.0.norm2.bias | 0.44% | 7.17% | 0.00% | 0.00% | 17.34% | 75.05% | 0.00% | 0.00% |
| unet.up_blocks.1.attentions.2.transformer_blocks.0.norm3.weight | 0.04% | 0.00% | 8.40% | 8.98% | 0.00% | 82.59% | 0.00% | 0.00% |
| unet.up_blocks.1.attentions.2.transformer_blocks.0.norm3.bias | 0.63% | 69.62% | 0.46% | 0.00% | 0.01% | 29.23% | 0.05% | 0.00% |
| unet.up_blocks.1.attentions.2.proj_out.weight | 0.00% | 0.00% | 0.16% | 3.98% | 0.11% | 95.72% | 0.00% | 0.03% |
| unet.up_blocks.1.attentions.2.proj_out.bias | 0.01% | 0.09% | 0.00% | 30.44% | 68.78% | 0.00% | 0.69% | 0.00% |
| unet.up_blocks.1.resnets.0.norm1.weight | 0.01% | 17.71% | 0.03% | 1.06% | 43.73% | 35.53% | 1.93% | 0.00% |
| unet.up_blocks.1.resnets.0.norm1.bias | 0.00% | 36.00% | 0.00% | 0.10% | 0.00% | 25.28% | 15.95% | 22.66% |
| unet.up_blocks.1.resnets.0.conv1.weight | 0.15% | 0.00% | 0.15% | 0.53% | 0.00% | 0.02% | 97.17% | 1.99% |
| unet.up_blocks.1.resnets.0.conv1.bias | 0.00% | 0.00% | 0.00% | 85.53% | 13.93% | 0.00% | 0.54% | 0.00% |
| unet.up_blocks.1.resnets.0.time_emb_proj.weight | 0.00% | 0.01% | 0.01% | 0.00% | 95.75% | 0.17% | 3.93% | 0.13% |
| unet.up_blocks.1.resnets.0.time_emb_proj.bias | 24.24% | 72.26% | 0.00% | 1.34% | 0.01% | 0.13% | 2.02% | 0.00% |
| unet.up_blocks.1.resnets.0.norm2.weight | 4.74% | 85.96% | 0.05% | 2.32% | 0.08% | 0.00% | 0.03% | 6.81% |
| unet.up_blocks.1.resnets.0.norm2.bias | 7.14% | 0.00% | 0.00% | 92.86% | 0.00% | 0.00% | 0.00% | 0.00% |
| unet.up_blocks.1.resnets.0.conv2.weight | 1.82% | 9.57% | 0.00% | 84.59% | 3.12% | 0.00% | 0.89% | 0.00% |
| unet.up_blocks.1.resnets.0.conv2.bias | 0.26% | 0.00% | 0.00% | 85.78% | 0.00% | 0.00% | 13.96% | 0.00% |
| unet.up_blocks.1.resnets.0.conv_shortcut.weight | 0.00% | 0.47% | 0.04% | 49.66% | 49.41% | 0.42% | 0.00% | 0.00% |
| unet.up_blocks.1.resnets.0.conv_shortcut.bias | 0.00% | 0.00% | 0.00% | 47.81% | 0.00% | 52.19% | 0.00% | 0.00% |
| unet.up_blocks.1.resnets.1.norm1.weight | 1.04% | 25.79% | 71.62% | 0.00% | 0.00% | 0.00% | 1.55% | 0.00% |
| unet.up_blocks.1.resnets.1.norm1.bias | 0.00% | 0.42% | 0.00% | 92.93% | 0.07% | 3.90% | 2.68% | 0.00% |
| unet.up_blocks.1.resnets.1.conv1.weight | 76.31% | 1.34% | 0.00% | 0.00% | 22.27% | 0.07% | 0.00% | 0.00% |
| unet.up_blocks.1.resnets.1.conv1.bias | 0.00% | 0.00% | 0.00% | 50.34% | 0.06% | 26.00% | 23.60% | 0.00% |
| unet.up_blocks.1.resnets.1.time_emb_proj.weight | 0.00% | 0.01% | 0.00% | 31.58% | 67.17% | 0.00% | 1.04% | 0.21% |
| unet.up_blocks.1.resnets.1.time_emb_proj.bias | 0.00% | 10.81% | 0.00% | 52.99% | 18.34% | 13.51% | 4.24% | 0.12% |
| unet.up_blocks.1.resnets.1.norm2.weight | 0.00% | 0.00% | 0.29% | 1.83% | 0.00% | 0.00% | 1.34% | 96.53% |
| unet.up_blocks.1.resnets.1.norm2.bias | 1.29% | 0.00% | 0.00% | 2.77% | 79.88% | 16.02% | 0.03% | 0.00% |
| unet.up_blocks.1.resnets.1.conv2.weight | 0.57% | 0.00% | 0.00% | 25.37% | 53.49% | 0.00% | 19.96% | 0.60% |
| unet.up_blocks.1.resnets.1.conv2.bias | 2.50% | 0.00% | 0.41% | 0.49% | 0.03% | 0.02% | 96.56% | 0.00% |
| unet.up_blocks.1.resnets.1.conv_shortcut.weight | 0.00% | 12.51% | 0.00% | 28.40% | 37.40% | 0.02% | 0.06% | 21.62% |
| unet.up_blocks.1.resnets.1.conv_shortcut.bias | 69.12% | 0.00% | 1.07% | 0.32% | 29.48% | 0.00% | 0.00% | 0.00% |
| unet.up_blocks.1.resnets.2.norm1.weight | 0.00% | 0.00% | 0.00% | 17.23% | 0.00% | 4.09% | 78.62% | 0.06% |
| unet.up_blocks.1.resnets.2.norm1.bias | 1.82% | 2.79% | 2.05% | 0.00% | 0.00% | 15.24% | 0.03% | 78.06% |
| unet.up_blocks.1.resnets.2.conv1.weight | 0.00% | 0.00% | 0.00% | 94.65% | 0.00% | 0.00% | 5.35% | 0.00% |
| unet.up_blocks.1.resnets.2.conv1.bias | 0.41% | 0.00% | 0.13% | 0.01% | 0.02% | 99.25% | 0.01% | 0.17% |
| unet.up_blocks.1.resnets.2.time_emb_proj.weight | 21.80% | 0.00% | 0.00% | 0.00% | 0.05% | 0.00% | 13.65% | 64.48% |
| unet.up_blocks.1.resnets.2.time_emb_proj.bias | 0.00% | 0.00% | 88.18% | 0.00% | 11.05% | 0.03% | 0.73% | 0.00% |
| unet.up_blocks.1.resnets.2.norm2.weight | 0.00% | 81.62% | 0.00% | 0.00% | 0.00% | 18.09% | 0.29% | 0.00% |
| unet.up_blocks.1.resnets.2.norm2.bias | 99.19% | 0.00% | 0.03% | 0.00% | 0.00% | 0.48% | 0.29% | 0.00% |
| unet.up_blocks.1.resnets.2.conv2.weight | 0.01% | 0.00% | 0.00% | 0.00% | 99.46% | 0.01% | 0.52% | 0.00% |
| unet.up_blocks.1.resnets.2.conv2.bias | 2.90% | 19.02% | 0.00% | 0.00% | 0.00% | 0.00% | 77.78% | 0.29% |
| unet.up_blocks.1.resnets.2.conv_shortcut.weight | 0.00% | 1.84% | 38.77% | 52.74% | 0.00% | 0.00% | 6.65% | 0.00% |
| unet.up_blocks.1.resnets.2.conv_shortcut.bias | 1.59% | 0.00% | 0.00% | 36.96% | 0.00% | 1.05% | 0.00% | 60.41% |
| unet.up_blocks.1.upsamplers.0.conv.weight | 0.00% | 9.96% | 0.01% | 1.08% | 1.78% | 84.42% | 0.00% | 2.75% |
| unet.up_blocks.1.upsamplers.0.conv.bias | 0.01% | 0.03% | 0.02% | 0.64% | 70.43% | 0.00% | 28.89% | 0.00% |
| unet.up_blocks.2.attentions.0.norm.weight | 0.00% | 0.00% | 0.00% | 0.43% | 0.00% | 21.21% | 78.36% | 0.00% |
| unet.up_blocks.2.attentions.0.norm.bias | 39.63% | 0.03% | 0.00% | 53.72% | 0.00% | 4.74% | 1.87% | 0.00% |
| unet.up_blocks.2.attentions.0.proj_in.weight | 0.00% | 16.25% | 0.12% | 37.21% | 0.00% | 0.00% | 2.62% | 43.79% |
| unet.up_blocks.2.attentions.0.proj_in.bias | 90.77% | 0.00% | 0.31% | 0.01% | 0.09% | 8.70% | 0.01% | 0.11% |
| unet.up_blocks.2.attentions.0.transformer_blocks.0.attn1.to_q.weight | 5.58% | 0.00% | 0.00% | 15.38% | 78.73% | 0.32% | 0.00% | 0.00% |
| unet.up_blocks.2.attentions.0.transformer_blocks.0.attn1.to_k.weight | 0.00% | 3.26% | 0.00% | 68.33% | 0.06% | 0.00% | 28.35% | 0.00% |
| unet.up_blocks.2.attentions.0.transformer_blocks.0.attn1.to_v.weight | 0.00% | 0.11% | 0.01% | 0.01% | 0.00% | 0.01% | 99.79% | 0.08% |
| unet.up_blocks.2.attentions.0.transformer_blocks.0.attn1.to_out.0.weight | 0.10% | 0.00% | 0.00% | 0.00% | 0.00% | 0.86% | 99.04% | 0.00% |
| unet.up_blocks.2.attentions.0.transformer_blocks.0.attn1.to_out.0.bias | 0.00% | 0.00% | 0.00% | 1.41% | 0.00% | 44.57% | 54.00% | 0.01% |
| unet.up_blocks.2.attentions.0.transformer_blocks.0.ff.net.0.proj.weight | 0.00% | 61.23% | 0.00% | 0.12% | 1.14% | 0.27% | 37.23% | 0.00% |
| unet.up_blocks.2.attentions.0.transformer_blocks.0.ff.net.0.proj.bias | 0.00% | 0.00% | 0.00% | 0.00% | 0.00% | 99.94% | 0.05% | 0.00% |
| unet.up_blocks.2.attentions.0.transformer_blocks.0.ff.net.2.weight | 0.00% | 96.28% | 0.13% | 0.01% | 0.04% | 0.47% | 1.00% | 2.08% |
| unet.up_blocks.2.attentions.0.transformer_blocks.0.ff.net.2.bias | 0.00% | 0.00% | 0.00% | 0.41% | 98.98% | 0.57% | 0.00% | 0.03% |
| unet.up_blocks.2.attentions.0.transformer_blocks.0.attn2.to_q.weight | 90.00% | 0.00% | 0.00% | 0.00% | 10.00% | 0.00% | 0.00% | 0.00% |
| unet.up_blocks.2.attentions.0.transformer_blocks.0.attn2.to_k.weight | 0.00% | 0.00% | 0.00% | 6.65% | 0.00% | 90.89% | 0.31% | 2.15% |
| unet.up_blocks.2.attentions.0.transformer_blocks.0.attn2.to_v.weight | 0.00% | 0.00% | 0.12% | 0.00% | 0.00% | 0.51% | 99.37% | 0.00% |
| unet.up_blocks.2.attentions.0.transformer_blocks.0.attn2.to_out.0.weight | 0.05% | 76.70% | 0.00% | 1.47% | 1.25% | 15.22% | 5.15% | 0.16% |
| unet.up_blocks.2.attentions.0.transformer_blocks.0.attn2.to_out.0.bias | 0.00% | 0.02% | 0.02% | 34.17% | 0.05% | 34.57% | 31.18% | 0.00% |
| unet.up_blocks.2.attentions.0.transformer_blocks.0.norm1.weight | 0.00% | 54.88% | 0.00% | 18.28% | 26.46% | 0.04% | 0.34% | 0.01% |
| unet.up_blocks.2.attentions.0.transformer_blocks.0.norm1.bias | 0.00% | 0.00% | 0.00% | 0.00% | 80.85% | 0.00% | 0.02% | 19.13% |
| unet.up_blocks.2.attentions.0.transformer_blocks.0.norm2.weight | 8.56% | 0.10% | 0.00% | 85.40% | 0.31% | 5.64% | 0.00% | 0.00% |
| unet.up_blocks.2.attentions.0.transformer_blocks.0.norm2.bias | 0.00% | 45.14% | 0.00% | 0.11% | 0.00% | 0.14% | 0.00% | 54.61% |
| unet.up_blocks.2.attentions.0.transformer_blocks.0.norm3.weight | 0.00% | 0.11% | 0.00% | 0.00% | 0.00% | 68.04% | 0.00% | 31.85% |
| unet.up_blocks.2.attentions.0.transformer_blocks.0.norm3.bias | 0.00% | 0.00% | 0.00% | 0.02% | 0.00% | 99.98% | 0.00% | 0.00% |
| unet.up_blocks.2.attentions.0.proj_out.weight | 0.00% | 0.35% | 0.00% | 0.22% | 0.16% | 45.68% | 53.57% | 0.02% |
| unet.up_blocks.2.attentions.0.proj_out.bias | 0.00% | 0.01% | 0.56% | 6.04% | 0.00% | 0.12% | 93.27% | 0.00% |
| unet.up_blocks.2.attentions.1.norm.weight | 0.00% | 72.05% | 0.00% | 18.47% | 5.85% | 0.99% | 0.00% | 2.64% |
| unet.up_blocks.2.attentions.1.norm.bias | 0.63% | 0.01% | 0.00% | 98.01% | 1.34% | 0.00% | 0.01% | 0.00% |
| unet.up_blocks.2.attentions.1.proj_in.weight | 8.17% | 10.33% | 0.27% | 0.01% | 81.21% | 0.00% | 0.00% | 0.00% |
| unet.up_blocks.2.attentions.1.proj_in.bias | 0.00% | 0.00% | 7.10% | 91.78% | 0.04% | 0.95% | 0.04% | 0.08% |
| unet.up_blocks.2.attentions.1.transformer_blocks.0.attn1.to_q.weight | 0.00% | 3.66% | 0.00% | 0.00% | 96.17% | 0.09% | 0.07% | 0.00% |
| unet.up_blocks.2.attentions.1.transformer_blocks.0.attn1.to_k.weight | 0.09% | 1.02% | 0.00% | 6.19% | 4.23% | 15.58% | 11.07% | 61.82% |
| unet.up_blocks.2.attentions.1.transformer_blocks.0.attn1.to_v.weight | 0.00% | 0.00% | 0.00% | 62.27% | 0.02% | 0.56% | 29.61% | 7.53% |
| unet.up_blocks.2.attentions.1.transformer_blocks.0.attn1.to_out.0.weight | 0.00% | 0.00% | 0.00% | 95.94% | 3.32% | 0.00% | 0.74% | 0.00% |
| unet.up_blocks.2.attentions.1.transformer_blocks.0.attn1.to_out.0.bias | 0.22% | 0.00% | 0.00% | 26.30% | 0.01% | 18.73% | 6.94% | 47.79% |
| unet.up_blocks.2.attentions.1.transformer_blocks.0.ff.net.0.proj.weight | 37.10% | 0.00% | 0.00% | 0.00% | 31.72% | 31.17% | 0.01% | 0.00% |
| unet.up_blocks.2.attentions.1.transformer_blocks.0.ff.net.0.proj.bias | 0.55% | 0.10% | 0.01% | 46.36% | 0.00% | 0.00% | 52.99% | 0.00% |
| unet.up_blocks.2.attentions.1.transformer_blocks.0.ff.net.2.weight | 8.78% | 0.00% | 0.00% | 85.72% | 0.01% | 0.18% | 0.05% | 5.26% |
| unet.up_blocks.2.attentions.1.transformer_blocks.0.ff.net.2.bias | 0.27% | 0.00% | 0.00% | 12.49% | 0.03% | 31.48% | 5.09% | 50.64% |
| unet.up_blocks.2.attentions.1.transformer_blocks.0.attn2.to_q.weight | 0.00% | 0.03% | 0.00% | 98.43% | 0.00% | 1.30% | 0.24% | 0.00% |
| unet.up_blocks.2.attentions.1.transformer_blocks.0.attn2.to_k.weight | 8.49% | 0.01% | 0.00% | 16.59% | 54.63% | 19.38% | 0.90% | 0.00% |
| unet.up_blocks.2.attentions.1.transformer_blocks.0.attn2.to_v.weight | 0.00% | 0.00% | 0.00% | 0.97% | 98.62% | 0.41% | 0.00% | 0.00% |
| unet.up_blocks.2.attentions.1.transformer_blocks.0.attn2.to_out.0.weight | 0.00% | 0.13% | 60.38% | 39.33% | 0.04% | 0.00% | 0.08% | 0.04% |
| unet.up_blocks.2.attentions.1.transformer_blocks.0.attn2.to_out.0.bias | 0.00% | 99.79% | 0.00% | 0.01% | 0.01% | 0.00% | 0.18% | 0.01% |
| unet.up_blocks.2.attentions.1.transformer_blocks.0.norm1.weight | 0.00% | 0.00% | 0.00% | 23.45% | 0.00% | 1.15% | 75.40% | 0.00% |
| unet.up_blocks.2.attentions.1.transformer_blocks.0.norm1.bias | 0.00% | 0.00% | 0.01% | 0.00% | 6.55% | 93.27% | 0.00% | 0.16% |
| unet.up_blocks.2.attentions.1.transformer_blocks.0.norm2.weight | 3.52% | 0.00% | 0.00% | 0.00% | 0.15% | 90.80% | 2.77% | 2.76% |
| unet.up_blocks.2.attentions.1.transformer_blocks.0.norm2.bias | 21.34% | 0.14% | 0.00% | 20.00% | 0.00% | 15.27% | 26.94% | 16.32% |
| unet.up_blocks.2.attentions.1.transformer_blocks.0.norm3.weight | 0.00% | 0.00% | 0.00% | 99.68% | 0.00% | 0.26% | 0.00% | 0.06% |
| unet.up_blocks.2.attentions.1.transformer_blocks.0.norm3.bias | 0.00% | 0.04% | 0.00% | 33.07% | 15.65% | 51.23% | 0.01% | 0.00% |
| unet.up_blocks.2.attentions.1.proj_out.weight | 1.54% | 0.00% | 0.00% | 0.11% | 0.00% | 97.14% | 0.00% | 1.20% |
| unet.up_blocks.2.attentions.1.proj_out.bias | 0.00% | 0.00% | 0.00% | 95.36% | 0.00% | 3.57% | 1.06% | 0.01% |
| unet.up_blocks.2.attentions.2.norm.weight | 0.17% | 0.00% | 0.00% | 99.82% | 0.01% | 0.00% | 0.00% | 0.00% |
| unet.up_blocks.2.attentions.2.norm.bias | 0.00% | 5.90% | 0.00% | 67.98% | 0.00% | 0.00% | 26.12% | 0.00% |
| unet.up_blocks.2.attentions.2.proj_in.weight | 2.68% | 0.08% | 0.00% | 17.69% | 0.00% | 36.86% | 42.69% | 0.00% |
| unet.up_blocks.2.attentions.2.proj_in.bias | 0.00% | 0.09% | 0.00% | 15.12% | 3.02% | 10.98% | 70.79% | 0.00% |
| unet.up_blocks.2.attentions.2.transformer_blocks.0.attn1.to_q.weight | 0.00% | 0.05% | 0.00% | 0.04% | 0.00% | 99.90% | 0.00% | 0.00% |
| unet.up_blocks.2.attentions.2.transformer_blocks.0.attn1.to_k.weight | 0.00% | 0.34% | 0.00% | 0.01% | 0.00% | 99.65% | 0.00% | 0.00% |
| unet.up_blocks.2.attentions.2.transformer_blocks.0.attn1.to_v.weight | 39.49% | 0.29% | 0.00% | 0.00% | 60.20% | 0.02% | 0.00% | 0.00% |
| unet.up_blocks.2.attentions.2.transformer_blocks.0.attn1.to_out.0.weight | 0.00% | 38.35% | 0.11% | 0.08% | 0.01% | 0.02% | 0.00% | 61.41% |
| unet.up_blocks.2.attentions.2.transformer_blocks.0.attn1.to_out.0.bias | 0.00% | 42.50% | 0.00% | 10.70% | 0.24% | 46.56% | 0.00% | 0.00% |
| unet.up_blocks.2.attentions.2.transformer_blocks.0.ff.net.0.proj.weight | 0.47% | 22.49% | 0.11% | 15.40% | 0.00% | 2.59% | 58.93% | 0.00% |
| unet.up_blocks.2.attentions.2.transformer_blocks.0.ff.net.0.proj.bias | 13.73% | 0.00% | 0.00% | 0.07% | 0.00% | 0.00% | 0.00% | 86.20% |
| unet.up_blocks.2.attentions.2.transformer_blocks.0.ff.net.2.weight | 77.87% | 0.05% | 0.45% | 0.00% | 21.60% | 0.00% | 0.01% | 0.01% |
| unet.up_blocks.2.attentions.2.transformer_blocks.0.ff.net.2.bias | 98.57% | 0.00% | 0.00% | 0.00% | 0.00% | 0.00% | 1.43% | 0.00% |
| unet.up_blocks.2.attentions.2.transformer_blocks.0.attn2.to_q.weight | 0.06% | 17.44% | 1.70% | 0.01% | 7.73% | 7.53% | 31.27% | 34.27% |
| unet.up_blocks.2.attentions.2.transformer_blocks.0.attn2.to_k.weight | 0.00% | 0.00% | 0.00% | 0.01% | 4.12% | 44.23% | 51.64% | 0.00% |
| unet.up_blocks.2.attentions.2.transformer_blocks.0.attn2.to_v.weight | 1.22% | 0.00% | 0.00% | 0.13% | 96.50% | 0.03% | 2.11% | 0.01% |
| unet.up_blocks.2.attentions.2.transformer_blocks.0.attn2.to_out.0.weight | 0.00% | 0.22% | 0.07% | 0.00% | 5.06% | 0.05% | 0.00% | 94.60% |
| unet.up_blocks.2.attentions.2.transformer_blocks.0.attn2.to_out.0.bias | 0.00% | 1.35% | 0.00% | 0.67% | 97.94% | 0.00% | 0.03% | 0.00% |
| unet.up_blocks.2.attentions.2.transformer_blocks.0.norm1.weight | 0.07% | 0.00% | 0.00% | 0.00% | 0.00% | 0.00% | 99.92% | 0.00% |
| unet.up_blocks.2.attentions.2.transformer_blocks.0.norm1.bias | 16.56% | 1.85% | 0.02% | 6.31% | 0.00% | 75.10% | 0.17% | 0.00% |
| unet.up_blocks.2.attentions.2.transformer_blocks.0.norm2.weight | 0.00% | 2.11% | 0.00% | 0.00% | 4.98% | 0.00% | 0.01% | 92.89% |
| unet.up_blocks.2.attentions.2.transformer_blocks.0.norm2.bias | 0.00% | 0.00% | 49.46% | 0.20% | 0.00% | 0.69% | 49.61% | 0.04% |
| unet.up_blocks.2.attentions.2.transformer_blocks.0.norm3.weight | 0.01% | 1.32% | 0.00% | 0.26% | 0.02% | 85.48% | 12.60% | 0.31% |
| unet.up_blocks.2.attentions.2.transformer_blocks.0.norm3.bias | 0.03% | 0.11% | 0.01% | 0.01% | 99.28% | 0.00% | 0.57% | 0.00% |
| unet.up_blocks.2.attentions.2.proj_out.weight | 0.00% | 0.04% | 0.01% | 0.68% | 0.01% | 98.87% | 0.39% | 0.01% |
| unet.up_blocks.2.attentions.2.proj_out.bias | 31.71% | 0.00% | 0.01% | 0.50% | 16.38% | 11.89% | 39.46% | 0.05% |
| unet.up_blocks.2.resnets.0.norm1.weight | 0.00% | 0.01% | 0.00% | 0.00% | 0.00% | 0.01% | 99.37% | 0.61% |
| unet.up_blocks.2.resnets.0.norm1.bias | 0.00% | 0.00% | 0.00% | 0.00% | 13.02% | 7.64% | 79.15% | 0.19% |
| unet.up_blocks.2.resnets.0.conv1.weight | 0.00% | 6.20% | 0.00% | 35.52% | 0.00% | 39.29% | 18.99% | 0.00% |
| unet.up_blocks.2.resnets.0.conv1.bias | 0.00% | 0.02% | 0.00% | 4.14% | 95.81% | 0.00% | 0.02% | 0.00% |
| unet.up_blocks.2.resnets.0.time_emb_proj.weight | 34.66% | 5.71% | 0.00% | 23.15% | 0.00% | 36.48% | 0.00% | 0.00% |
| unet.up_blocks.2.resnets.0.time_emb_proj.bias | 0.00% | 0.00% | 0.00% | 0.03% | 28.90% | 41.62% | 29.45% | 0.00% |
| unet.up_blocks.2.resnets.0.norm2.weight | 0.00% | 0.00% | 0.00% | 65.57% | 0.00% | 28.62% | 5.80% | 0.00% |
| unet.up_blocks.2.resnets.0.norm2.bias | 74.75% | 1.77% | 0.00% | 23.46% | 0.00% | 0.01% | 0.00% | 0.00% |
| unet.up_blocks.2.resnets.0.conv2.weight | 0.00% | 0.93% | 0.00% | 32.68% | 0.00% | 0.01% | 65.83% | 0.55% |
| unet.up_blocks.2.resnets.0.conv2.bias | 0.00% | 0.47% | 0.00% | 0.00% | 2.53% | 24.54% | 12.93% | 59.53% |
| unet.up_blocks.2.resnets.0.conv_shortcut.weight | 0.22% | 0.01% | 0.80% | 0.55% | 0.29% | 93.44% | 4.69% | 0.00% |
| unet.up_blocks.2.resnets.0.conv_shortcut.bias | 0.00% | 34.73% | 53.73% | 9.27% | 0.00% | 0.67% | 0.07% | 1.52% |
| unet.up_blocks.2.resnets.1.norm1.weight | 0.00% | 0.00% | 0.00% | 98.70% | 0.00% | 0.02% | 1.28% | 0.00% |
| unet.up_blocks.2.resnets.1.norm1.bias | 0.00% | 0.00% | 0.00% | 0.00% | 24.45% | 49.76% | 25.79% | 0.00% |
| unet.up_blocks.2.resnets.1.conv1.weight | 0.00% | 98.18% | 1.63% | 0.00% | 0.00% | 0.00% | 0.19% | 0.00% |
| unet.up_blocks.2.resnets.1.conv1.bias | 0.00% | 0.00% | 0.08% | 94.25% | 0.00% | 5.55% | 0.00% | 0.12% |
| unet.up_blocks.2.resnets.1.time_emb_proj.weight | 10.33% | 0.06% | 0.00% | 4.40% | 25.35% | 0.31% | 58.60% | 0.94% |
| unet.up_blocks.2.resnets.1.time_emb_proj.bias | 0.23% | 1.39% | 0.00% | 0.01% | 0.24% | 0.50% | 97.62% | 0.00% |
| unet.up_blocks.2.resnets.1.norm2.weight | 8.38% | 0.00% | 0.00% | 91.07% | 0.01% | 0.00% | 0.54% | 0.00% |
| unet.up_blocks.2.resnets.1.norm2.bias | 3.24% | 42.29% | 0.01% | 0.04% | 52.74% | 1.62% | 0.07% | 0.00% |
| unet.up_blocks.2.resnets.1.conv2.weight | 0.00% | 0.00% | 0.00% | 12.29% | 0.01% | 85.20% | 1.37% | 1.14% |
| unet.up_blocks.2.resnets.1.conv2.bias | 0.00% | 0.01% | 0.00% | 63.40% | 0.00% | 3.12% | 33.39% | 0.09% |
| unet.up_blocks.2.resnets.1.conv_shortcut.weight | 0.02% | 99.93% | 0.00% | 0.00% | 0.02% | 0.00% | 0.02% | 0.01% |
| unet.up_blocks.2.resnets.1.conv_shortcut.bias | 0.00% | 0.00% | 0.00% | 97.32% | 0.00% | 0.36% | 2.31% | 0.00% |
| unet.up_blocks.2.resnets.2.norm1.weight | 0.00% | 0.00% | 0.00% | 90.12% | 9.49% | 0.38% | 0.01% | 0.00% |
| unet.up_blocks.2.resnets.2.norm1.bias | 0.00% | 98.40% | 0.02% | 0.52% | 0.00% | 1.05% | 0.00% | 0.00% |
| unet.up_blocks.2.resnets.2.conv1.weight | 0.01% | 0.02% | 0.00% | 20.84% | 0.52% | 46.33% | 2.54% | 29.75% |
| unet.up_blocks.2.resnets.2.conv1.bias | 0.00% | 1.29% | 0.00% | 0.00% | 0.02% | 0.00% | 98.69% | 0.00% |
| unet.up_blocks.2.resnets.2.time_emb_proj.weight | 0.14% | 0.02% | 0.00% | 0.55% | 0.00% | 0.00% | 99.15% | 0.14% |
| unet.up_blocks.2.resnets.2.time_emb_proj.bias | 0.00% | 0.00% | 0.00% | 40.58% | 0.19% | 59.13% | 0.05% | 0.06% |
| unet.up_blocks.2.resnets.2.norm2.weight | 0.00% | 2.53% | 0.00% | 2.14% | 92.93% | 0.00% | 2.40% | 0.00% |
| unet.up_blocks.2.resnets.2.norm2.bias | 0.01% | 0.00% | 6.41% | 82.92% | 0.00% | 0.00% | 10.60% | 0.06% |
| unet.up_blocks.2.resnets.2.conv2.weight | 0.01% | 0.06% | 0.00% | 0.00% | 0.00% | 99.90% | 0.03% | 0.00% |
| unet.up_blocks.2.resnets.2.conv2.bias | 0.11% | 0.00% | 0.00% | 72.76% | 0.02% | 0.13% | 8.75% | 18.22% |
| unet.up_blocks.2.resnets.2.conv_shortcut.weight | 0.00% | 0.13% | 0.00% | 0.00% | 4.36% | 0.00% | 0.00% | 95.50% |
| unet.up_blocks.2.resnets.2.conv_shortcut.bias | 0.00% | 2.35% | 0.00% | 63.28% | 1.42% | 0.00% | 5.77% | 27.17% |
| unet.up_blocks.2.upsamplers.0.conv.weight | 0.00% | 52.63% | 0.06% | 0.00% | 0.00% | 0.02% | 0.00% | 47.28% |
| unet.up_blocks.2.upsamplers.0.conv.bias | 0.02% | 97.82% | 0.00% | 2.05% | 0.00% | 0.10% | 0.00% | 0.00% |
| unet.up_blocks.3.attentions.0.norm.weight | 0.00% | 0.00% | 0.04% | 30.83% | 0.00% | 0.19% | 68.93% | 0.00% |
| unet.up_blocks.3.attentions.0.norm.bias | 0.00% | 1.92% | 0.00% | 69.13% | 0.00% | 28.54% | 0.01% | 0.39% |
| unet.up_blocks.3.attentions.0.proj_in.weight | 0.00% | 0.00% | 0.04% | 0.02% | 0.00% | 99.94% | 0.00% | 0.00% |
| unet.up_blocks.3.attentions.0.proj_in.bias | 0.00% | 0.00% | 0.00% | 0.25% | 0.00% | 99.74% | 0.00% | 0.00% |
| unet.up_blocks.3.attentions.0.transformer_blocks.0.attn1.to_q.weight | 0.01% | 29.07% | 0.00% | 0.00% | 0.00% | 2.24% | 50.46% | 18.22% |
| unet.up_blocks.3.attentions.0.transformer_blocks.0.attn1.to_k.weight | 0.00% | 0.01% | 0.00% | 0.00% | 5.62% | 0.00% | 0.27% | 94.10% |
| unet.up_blocks.3.attentions.0.transformer_blocks.0.attn1.to_v.weight | 0.00% | 86.05% | 0.00% | 13.76% | 0.00% | 0.18% | 0.00% | 0.00% |
| unet.up_blocks.3.attentions.0.transformer_blocks.0.attn1.to_out.0.weight | 0.23% | 0.00% | 0.03% | 0.00% | 0.00% | 0.00% | 0.00% | 99.74% |
| unet.up_blocks.3.attentions.0.transformer_blocks.0.attn1.to_out.0.bias | 0.08% | 0.01% | 0.00% | 2.17% | 0.00% | 96.82% | 0.00% | 0.92% |
| unet.up_blocks.3.attentions.0.transformer_blocks.0.ff.net.0.proj.weight | 0.04% | 0.00% | 0.00% | 0.03% | 5.48% | 6.56% | 87.89% | 0.00% |
| unet.up_blocks.3.attentions.0.transformer_blocks.0.ff.net.0.proj.bias | 0.00% | 0.00% | 0.00% | 0.13% | 99.07% | 0.79% | 0.00% | 0.00% |
| unet.up_blocks.3.attentions.0.transformer_blocks.0.ff.net.2.weight | 0.00% | 74.38% | 0.00% | 0.00% | 25.59% | 0.00% | 0.00% | 0.02% |
| unet.up_blocks.3.attentions.0.transformer_blocks.0.ff.net.2.bias | 0.81% | 15.82% | 0.72% | 1.14% | 0.00% | 0.06% | 0.20% | 81.24% |
| unet.up_blocks.3.attentions.0.transformer_blocks.0.attn2.to_q.weight | 0.02% | 0.00% | 0.01% | 1.16% | 98.46% | 0.00% | 0.35% | 0.00% |
| unet.up_blocks.3.attentions.0.transformer_blocks.0.attn2.to_k.weight | 0.00% | 22.94% | 0.00% | 0.00% | 0.00% | 34.66% | 42.39% | 0.00% |
| unet.up_blocks.3.attentions.0.transformer_blocks.0.attn2.to_v.weight | 0.00% | 0.00% | 0.00% | 0.00% | 0.13% | 0.14% | 0.01% | 99.72% |
| unet.up_blocks.3.attentions.0.transformer_blocks.0.attn2.to_out.0.weight | 0.04% | 0.00% | 98.79% | 0.61% | 0.02% | 0.12% | 0.36% | 0.06% |
| unet.up_blocks.3.attentions.0.transformer_blocks.0.attn2.to_out.0.bias | 0.02% | 1.63% | 0.00% | 0.02% | 0.03% | 2.17% | 1.60% | 94.52% |
| unet.up_blocks.3.attentions.0.transformer_blocks.0.norm1.weight | 0.00% | 98.69% | 0.93% | 0.00% | 0.38% | 0.00% | 0.01% | 0.00% |
| unet.up_blocks.3.attentions.0.transformer_blocks.0.norm1.bias | 0.00% | 0.01% | 0.02% | 0.07% | 0.13% | 0.26% | 99.51% | 0.00% |
| unet.up_blocks.3.attentions.0.transformer_blocks.0.norm2.weight | 0.77% | 0.00% | 0.01% | 0.03% | 0.00% | 0.46% | 98.73% | 0.00% |
| unet.up_blocks.3.attentions.0.transformer_blocks.0.norm2.bias | 0.00% | 0.00% | 0.00% | 37.53% | 62.46% | 0.01% | 0.00% | 0.00% |
| unet.up_blocks.3.attentions.0.transformer_blocks.0.norm3.weight | 0.00% | 0.00% | 0.00% | 0.87% | 0.00% | 0.09% | 94.97% | 4.08% |
| unet.up_blocks.3.attentions.0.transformer_blocks.0.norm3.bias | 0.00% | 12.35% | 0.00% | 0.22% | 0.15% | 0.06% | 86.97% | 0.25% |
| unet.up_blocks.3.attentions.0.proj_out.weight | 0.00% | 0.01% | 0.00% | 0.00% | 0.00% | 0.12% | 99.53% | 0.34% |
| unet.up_blocks.3.attentions.0.proj_out.bias | 0.00% | 0.00% | 0.00% | 99.00% | 0.07% | 0.00% | 0.37% | 0.56% |
| unet.up_blocks.3.attentions.1.norm.weight | 0.00% | 57.62% | 0.00% | 0.00% | 22.13% | 18.23% | 1.26% | 0.76% |
| unet.up_blocks.3.attentions.1.norm.bias | 0.13% | 60.96% | 0.00% | 1.06% | 7.11% | 28.68% | 2.06% | 0.00% |
| unet.up_blocks.3.attentions.1.proj_in.weight | 0.00% | 0.02% | 0.00% | 0.15% | 72.03% | 27.13% | 0.67% | 0.00% |
| unet.up_blocks.3.attentions.1.proj_in.bias | 0.03% | 10.59% | 0.00% | 84.21% | 0.00% | 0.01% | 5.16% | 0.00% |
| unet.up_blocks.3.attentions.1.transformer_blocks.0.attn1.to_q.weight | 0.00% | 4.49% | 0.00% | 0.00% | 0.00% | 0.04% | 95.46% | 0.00% |
| unet.up_blocks.3.attentions.1.transformer_blocks.0.attn1.to_k.weight | 0.00% | 1.61% | 0.15% | 0.01% | 4.29% | 83.54% | 10.40% | 0.00% |
| unet.up_blocks.3.attentions.1.transformer_blocks.0.attn1.to_v.weight | 26.05% | 0.12% | 0.00% | 72.93% | 0.28% | 0.42% | 0.21% | 0.00% |
| unet.up_blocks.3.attentions.1.transformer_blocks.0.attn1.to_out.0.weight | 0.25% | 1.81% | 5.23% | 69.58% | 1.21% | 0.27% | 16.90% | 4.75% |
| unet.up_blocks.3.attentions.1.transformer_blocks.0.attn1.to_out.0.bias | 0.18% | 0.08% | 0.00% | 0.01% | 0.10% | 6.46% | 93.18% | 0.00% |
| unet.up_blocks.3.attentions.1.transformer_blocks.0.ff.net.0.proj.weight | 0.72% | 36.11% | 0.00% | 0.00% | 3.70% | 0.08% | 15.61% | 43.78% |
| unet.up_blocks.3.attentions.1.transformer_blocks.0.ff.net.0.proj.bias | 0.06% | 0.00% | 0.00% | 7.59% | 10.25% | 47.21% | 33.54% | 1.35% |
| unet.up_blocks.3.attentions.1.transformer_blocks.0.ff.net.2.weight | 0.00% | 0.00% | 0.00% | 61.40% | 33.88% | 0.15% | 1.53% | 3.05% |
| unet.up_blocks.3.attentions.1.transformer_blocks.0.ff.net.2.bias | 0.46% | 0.24% | 0.00% | 0.15% | 50.09% | 0.00% | 49.06% | 0.00% |
| unet.up_blocks.3.attentions.1.transformer_blocks.0.attn2.to_q.weight | 0.00% | 0.00% | 0.00% | 0.51% | 94.69% | 3.79% | 1.01% | 0.00% |
| unet.up_blocks.3.attentions.1.transformer_blocks.0.attn2.to_k.weight | 0.03% | 0.00% | 0.00% | 0.04% | 98.84% | 0.38% | 0.01% | 0.71% |
| unet.up_blocks.3.attentions.1.transformer_blocks.0.attn2.to_v.weight | 0.00% | 70.59% | 0.00% | 0.20% | 11.79% | 17.39% | 0.03% | 0.00% |
| unet.up_blocks.3.attentions.1.transformer_blocks.0.attn2.to_out.0.weight | 0.07% | 0.09% | 0.00% | 99.50% | 0.00% | 0.00% | 0.01% | 0.33% |
| unet.up_blocks.3.attentions.1.transformer_blocks.0.attn2.to_out.0.bias | 0.02% | 0.55% | 0.00% | 78.10% | 20.69% | 0.00% | 0.64% | 0.00% |
| unet.up_blocks.3.attentions.1.transformer_blocks.0.norm1.weight | 0.00% | 0.25% | 0.00% | 87.41% | 12.18% | 0.14% | 0.00% | 0.03% |
| unet.up_blocks.3.attentions.1.transformer_blocks.0.norm1.bias | 0.08% | 5.95% | 0.00% | 93.06% | 0.03% | 0.01% | 0.00% | 0.87% |
| unet.up_blocks.3.attentions.1.transformer_blocks.0.norm2.weight | 0.00% | 0.00% | 0.00% | 49.66% | 0.00% | 0.04% | 0.21% | 50.08% |
| unet.up_blocks.3.attentions.1.transformer_blocks.0.norm2.bias | 0.00% | 2.28% | 0.00% | 87.82% | 0.32% | 9.51% | 0.07% | 0.00% |
| unet.up_blocks.3.attentions.1.transformer_blocks.0.norm3.weight | 0.28% | 0.00% | 0.14% | 42.92% | 0.00% | 56.66% | 0.00% | 0.00% |
| unet.up_blocks.3.attentions.1.transformer_blocks.0.norm3.bias | 0.00% | 0.00% | 0.00% | 0.01% | 99.92% | 0.00% | 0.07% | 0.00% |
| unet.up_blocks.3.attentions.1.proj_out.weight | 0.00% | 0.00% | 0.00% | 0.69% | 29.59% | 3.06% | 66.66% | 0.00% |
| unet.up_blocks.3.attentions.1.proj_out.bias | 9.42% | 0.11% | 0.03% | 0.00% | 0.12% | 90.29% | 0.02% | 0.00% |
| unet.up_blocks.3.attentions.2.norm.weight | 3.11% | 0.00% | 0.00% | 0.03% | 7.39% | 73.86% | 0.22% | 15.38% |
| unet.up_blocks.3.attentions.2.norm.bias | 0.00% | 0.01% | 0.10% | 0.00% | 40.66% | 59.19% | 0.03% | 0.00% |
| unet.up_blocks.3.attentions.2.proj_in.weight | 0.00% | 0.00% | 0.00% | 0.12% | 0.01% | 0.00% | 99.87% | 0.00% |
| unet.up_blocks.3.attentions.2.proj_in.bias | 0.04% | 5.55% | 0.00% | 0.00% | 0.00% | 94.39% | 0.02% | 0.00% |
| unet.up_blocks.3.attentions.2.transformer_blocks.0.attn1.to_q.weight | 0.00% | 0.00% | 0.00% | 8.17% | 0.17% | 47.29% | 44.37% | 0.00% |
| unet.up_blocks.3.attentions.2.transformer_blocks.0.attn1.to_k.weight | 0.00% | 28.19% | 0.00% | 35.35% | 0.00% | 0.03% | 32.04% | 4.39% |
| unet.up_blocks.3.attentions.2.transformer_blocks.0.attn1.to_v.weight | 0.00% | 1.08% | 0.00% | 1.05% | 97.35% | 0.00% | 0.00% | 0.53% |
| unet.up_blocks.3.attentions.2.transformer_blocks.0.attn1.to_out.0.weight | 0.00% | 3.02% | 0.19% | 0.00% | 0.00% | 0.00% | 96.09% | 0.69% |
| unet.up_blocks.3.attentions.2.transformer_blocks.0.attn1.to_out.0.bias | 0.03% | 3.52% | 0.20% | 0.00% | 1.28% | 94.97% | 0.00% | 0.00% |
| unet.up_blocks.3.attentions.2.transformer_blocks.0.ff.net.0.proj.weight | 0.00% | 0.45% | 0.05% | 0.00% | 95.55% | 3.39% | 0.55% | 0.00% |
| unet.up_blocks.3.attentions.2.transformer_blocks.0.ff.net.0.proj.bias | 0.01% | 0.01% | 0.00% | 37.07% | 0.39% | 0.00% | 39.34% | 23.18% |
| unet.up_blocks.3.attentions.2.transformer_blocks.0.ff.net.2.weight | 0.00% | 0.00% | 0.00% | 0.00% | 0.00% | 1.67% | 98.32% | 0.00% |
| unet.up_blocks.3.attentions.2.transformer_blocks.0.ff.net.2.bias | 0.05% | 0.25% | 0.00% | 74.07% | 0.30% | 0.29% | 25.05% | 0.00% |
| unet.up_blocks.3.attentions.2.transformer_blocks.0.attn2.to_q.weight | 27.16% | 0.00% | 0.00% | 1.11% | 71.72% | 0.00% | 0.00% | 0.00% |
| unet.up_blocks.3.attentions.2.transformer_blocks.0.attn2.to_k.weight | 0.03% | 37.59% | 0.01% | 25.41% | 10.18% | 0.05% | 0.05% | 26.68% |
| unet.up_blocks.3.attentions.2.transformer_blocks.0.attn2.to_v.weight | 0.02% | 0.00% | 9.83% | 3.64% | 0.00% | 0.00% | 6.66% | 79.85% |
| unet.up_blocks.3.attentions.2.transformer_blocks.0.attn2.to_out.0.weight | 0.00% | 5.52% | 0.00% | 30.17% | 1.13% | 0.03% | 63.15% | 0.00% |
| unet.up_blocks.3.attentions.2.transformer_blocks.0.attn2.to_out.0.bias | 0.00% | 0.05% | 0.00% | 0.00% | 0.00% | 66.98% | 32.97% | 0.00% |
| unet.up_blocks.3.attentions.2.transformer_blocks.0.norm1.weight | 0.00% | 0.59% | 97.42% | 0.01% | 0.00% | 0.11% | 1.88% | 0.00% |
| unet.up_blocks.3.attentions.2.transformer_blocks.0.norm1.bias | 46.55% | 17.57% | 0.00% | 5.04% | 19.38% | 11.44% | 0.01% | 0.00% |
| unet.up_blocks.3.attentions.2.transformer_blocks.0.norm2.weight | 0.00% | 0.07% | 0.00% | 99.66% | 0.00% | 0.28% | 0.00% | 0.00% |
| unet.up_blocks.3.attentions.2.transformer_blocks.0.norm2.bias | 0.04% | 2.29% | 0.24% | 3.58% | 93.81% | 0.00% | 0.04% | 0.00% |
| unet.up_blocks.3.attentions.2.transformer_blocks.0.norm3.weight | 0.09% | 0.00% | 0.00% | 0.96% | 0.11% | 29.64% | 69.20% | 0.00% |
| unet.up_blocks.3.attentions.2.transformer_blocks.0.norm3.bias | 0.00% | 0.00% | 0.00% | 7.91% | 90.99% | 0.22% | 0.11% | 0.76% |
| unet.up_blocks.3.attentions.2.proj_out.weight | 0.00% | 0.00% | 0.00% | 0.00% | 46.93% | 0.10% | 42.20% | 10.77% |
| unet.up_blocks.3.attentions.2.proj_out.bias | 0.00% | 0.00% | 0.00% | 0.10% | 0.00% | 0.00% | 99.90% | 0.00% |
| unet.up_blocks.3.resnets.0.norm1.weight | 0.00% | 0.00% | 0.00% | 62.90% | 0.49% | 35.64% | 0.97% | 0.00% |
| unet.up_blocks.3.resnets.0.norm1.bias | 0.53% | 12.37% | 0.00% | 0.08% | 0.04% | 86.29% | 0.14% | 0.56% |
| unet.up_blocks.3.resnets.0.conv1.weight | 0.00% | 34.09% | 65.88% | 0.00% | 0.00% | 0.00% | 0.03% | 0.00% |
| unet.up_blocks.3.resnets.0.conv1.bias | 0.00% | 0.00% | 0.02% | 54.01% | 0.04% | 0.00% | 45.93% | 0.00% |
| unet.up_blocks.3.resnets.0.time_emb_proj.weight | 0.09% | 0.03% | 0.00% | 26.89% | 72.30% | 0.00% | 0.00% | 0.69% |
| unet.up_blocks.3.resnets.0.time_emb_proj.bias | 20.36% | 76.60% | 0.00% | 0.33% | 1.24% | 0.00% | 1.47% | 0.00% |
| unet.up_blocks.3.resnets.0.norm2.weight | 0.00% | 0.68% | 0.23% | 0.00% | 0.24% | 2.24% | 84.05% | 12.55% |
| unet.up_blocks.3.resnets.0.norm2.bias | 0.00% | 0.00% | 0.00% | 26.55% | 0.00% | 50.64% | 22.81% | 0.00% |
| unet.up_blocks.3.resnets.0.conv2.weight | 0.00% | 2.00% | 0.00% | 0.00% | 0.00% | 69.45% | 0.00% | 28.55% |
| unet.up_blocks.3.resnets.0.conv2.bias | 44.47% | 0.01% | 0.06% | 20.89% | 0.01% | 1.94% | 32.62% | 0.00% |
| unet.up_blocks.3.resnets.0.conv_shortcut.weight | 0.13% | 0.49% | 0.04% | 0.21% | 0.29% | 98.84% | 0.00% | 0.00% |
| unet.up_blocks.3.resnets.0.conv_shortcut.bias | 0.00% | 0.00% | 63.81% | 0.00% | 0.00% | 36.01% | 0.18% | 0.00% |
| unet.up_blocks.3.resnets.1.norm1.weight | 0.59% | 0.38% | 0.00% | 1.47% | 0.03% | 20.06% | 77.46% | 0.00% |
| unet.up_blocks.3.resnets.1.norm1.bias | 0.01% | 0.00% | 1.13% | 1.99% | 0.00% | 0.37% | 96.50% | 0.00% |
| unet.up_blocks.3.resnets.1.conv1.weight | 1.23% | 0.00% | 0.00% | 0.63% | 61.15% | 0.00% | 36.98% | 0.00% |
| unet.up_blocks.3.resnets.1.conv1.bias | 0.01% | 0.00% | 3.76% | 0.00% | 0.14% | 0.00% | 96.08% | 0.00% |
| unet.up_blocks.3.resnets.1.time_emb_proj.weight | 2.31% | 0.00% | 0.00% | 50.93% | 0.00% | 0.00% | 0.09% | 46.66% |
| unet.up_blocks.3.resnets.1.time_emb_proj.bias | 0.00% | 0.00% | 0.00% | 0.00% | 0.00% | 11.41% | 56.86% | 31.73% |
| unet.up_blocks.3.resnets.1.norm2.weight | 99.97% | 0.00% | 0.00% | 0.00% | 0.00% | 0.02% | 0.00% | 0.00% |
| unet.up_blocks.3.resnets.1.norm2.bias | 0.00% | 25.74% | 0.00% | 26.81% | 0.03% | 4.72% | 18.31% | 24.38% |
| unet.up_blocks.3.resnets.1.conv2.weight | 3.10% | 42.46% | 0.00% | 0.97% | 0.05% | 38.53% | 14.86% | 0.03% |
| unet.up_blocks.3.resnets.1.conv2.bias | 0.10% | 1.44% | 0.00% | 30.54% | 0.90% | 0.12% | 0.09% | 66.82% |
| unet.up_blocks.3.resnets.1.conv_shortcut.weight | 0.00% | 0.00% | 0.00% | 0.18% | 0.09% | 77.12% | 22.62% | 0.00% |
| unet.up_blocks.3.resnets.1.conv_shortcut.bias | 0.00% | 0.00% | 0.00% | 0.00% | 0.00% | 100.00% | 0.00% | 0.00% |
| unet.up_blocks.3.resnets.2.norm1.weight | 98.67% | 0.00% | 0.00% | 1.31% | 0.00% | 0.00% | 0.02% | 0.00% |
| unet.up_blocks.3.resnets.2.norm1.bias | 0.00% | 0.00% | 0.00% | 5.37% | 17.34% | 13.20% | 63.12% | 0.97% |
| unet.up_blocks.3.resnets.2.conv1.weight | 0.00% | 0.08% | 0.00% | 83.63% | 14.82% | 1.46% | 0.01% | 0.00% |
| unet.up_blocks.3.resnets.2.conv1.bias | 0.00% | 0.55% | 0.00% | 0.00% | 4.69% | 9.25% | 85.50% | 0.00% |
| unet.up_blocks.3.resnets.2.time_emb_proj.weight | 0.16% | 0.00% | 0.00% | 0.00% | 55.15% | 7.07% | 37.60% | 0.02% |
| unet.up_blocks.3.resnets.2.time_emb_proj.bias | 0.00% | 0.00% | 0.00% | 54.88% | 44.93% | 0.19% | 0.00% | 0.00% |
| unet.up_blocks.3.resnets.2.norm2.weight | 0.00% | 0.00% | 0.01% | 66.34% | 0.00% | 33.64% | 0.00% | 0.00% |
| unet.up_blocks.3.resnets.2.norm2.bias | 0.01% | 2.33% | 0.02% | 61.62% | 33.67% | 0.00% | 0.36% | 2.00% |
| unet.up_blocks.3.resnets.2.conv2.weight | 0.00% | 0.00% | 0.00% | 0.00% | 0.00% | 96.18% | 0.00% | 3.82% |
| unet.up_blocks.3.resnets.2.conv2.bias | 0.10% | 0.73% | 0.00% | 0.01% | 0.00% | 0.05% | 96.68% | 2.43% |
| unet.up_blocks.3.resnets.2.conv_shortcut.weight | 2.80% | 0.01% | 0.04% | 16.37% | 10.29% | 12.03% | 0.01% | 58.44% |
| unet.up_blocks.3.resnets.2.conv_shortcut.bias | 34.36% | 0.14% | 0.30% | 8.45% | 0.48% | 0.15% | 3.35% | 52.78% |
| unet.mid_block.attentions.0.norm.weight | 0.80% | 0.00% | 0.00% | 0.00% | 0.02% | 98.86% | 0.32% | 0.00% |
| unet.mid_block.attentions.0.norm.bias | 0.00% | 0.00% | 1.21% | 3.95% | 1.04% | 0.28% | 4.73% | 88.78% |
| unet.mid_block.attentions.0.proj_in.weight | 0.04% | 0.42% | 0.00% | 0.00% | 0.00% | 0.20% | 99.34% | 0.00% |
| unet.mid_block.attentions.0.proj_in.bias | 1.02% | 0.00% | 0.66% | 0.00% | 0.00% | 0.00% | 0.03% | 98.29% |
| unet.mid_block.attentions.0.transformer_blocks.0.attn1.to_q.weight | 0.00% | 56.80% | 0.00% | 43.07% | 0.02% | 0.08% | 0.02% | 0.00% |
| unet.mid_block.attentions.0.transformer_blocks.0.attn1.to_k.weight | 0.00% | 0.00% | 1.88% | 42.52% | 9.28% | 0.01% | 46.31% | 0.00% |
| unet.mid_block.attentions.0.transformer_blocks.0.attn1.to_v.weight | 0.00% | 93.68% | 0.02% | 0.13% | 0.00% | 0.05% | 6.12% | 0.00% |
| unet.mid_block.attentions.0.transformer_blocks.0.attn1.to_out.0.weight | 51.40% | 0.40% | 0.00% | 0.37% | 46.09% | 1.73% | 0.01% | 0.01% |
| unet.mid_block.attentions.0.transformer_blocks.0.attn1.to_out.0.bias | 0.00% | 0.03% | 0.00% | 0.01% | 0.00% | 5.11% | 4.74% | 90.11% |
| unet.mid_block.attentions.0.transformer_blocks.0.ff.net.0.proj.weight | 28.87% | 69.97% | 0.50% | 0.36% | 0.00% | 0.00% | 0.26% | 0.02% |
| unet.mid_block.attentions.0.transformer_blocks.0.ff.net.0.proj.bias | 0.42% | 93.18% | 0.00% | 2.14% | 1.06% | 0.00% | 3.12% | 0.08% |
| unet.mid_block.attentions.0.transformer_blocks.0.ff.net.2.weight | 0.73% | 0.08% | 0.00% | 2.40% | 0.10% | 22.06% | 74.62% | 0.00% |
| unet.mid_block.attentions.0.transformer_blocks.0.ff.net.2.bias | 0.00% | 0.00% | 0.02% | 0.00% | 0.00% | 98.43% | 1.55% | 0.00% |
| unet.mid_block.attentions.0.transformer_blocks.0.attn2.to_q.weight | 0.74% | 48.93% | 0.00% | 19.73% | 0.03% | 29.99% | 0.00% | 0.58% |
| unet.mid_block.attentions.0.transformer_blocks.0.attn2.to_k.weight | 0.17% | 15.51% | 0.01% | 0.00% | 10.42% | 73.87% | 0.01% | 0.01% |
| unet.mid_block.attentions.0.transformer_blocks.0.attn2.to_v.weight | 2.25% | 0.02% | 0.01% | 29.69% | 0.08% | 0.00% | 63.17% | 4.77% |
| unet.mid_block.attentions.0.transformer_blocks.0.attn2.to_out.0.weight | 0.00% | 0.06% | 0.00% | 8.08% | 28.92% | 0.00% | 0.40% | 62.54% |
| unet.mid_block.attentions.0.transformer_blocks.0.attn2.to_out.0.bias | 0.56% | 0.14% | 0.00% | 0.00% | 82.79% | 14.82% | 1.69% | 0.00% |
| unet.mid_block.attentions.0.transformer_blocks.0.norm1.weight | 0.00% | 0.02% | 0.28% | 19.89% | 0.00% | 0.11% | 0.06% | 79.64% |
| unet.mid_block.attentions.0.transformer_blocks.0.norm1.bias | 0.11% | 0.00% | 0.00% | 0.02% | 0.00% | 1.10% | 98.76% | 0.01% |
| unet.mid_block.attentions.0.transformer_blocks.0.norm2.weight | 0.13% | 99.59% | 0.00% | 0.06% | 0.00% | 0.04% | 0.00% | 0.19% |
| unet.mid_block.attentions.0.transformer_blocks.0.norm2.bias | 29.74% | 5.40% | 0.00% | 1.00% | 63.85% | 0.01% | 0.01% | 0.00% |
| unet.mid_block.attentions.0.transformer_blocks.0.norm3.weight | 54.47% | 40.33% | 0.00% | 0.00% | 3.34% | 0.05% | 1.81% | 0.00% |
| unet.mid_block.attentions.0.transformer_blocks.0.norm3.bias | 0.00% | 0.00% | 0.00% | 0.71% | 41.55% | 0.00% | 0.00% | 57.74% |
| unet.mid_block.attentions.0.proj_out.weight | 0.00% | 0.00% | 0.01% | 19.11% | 80.65% | 0.21% | 0.01% | 0.00% |
| unet.mid_block.attentions.0.proj_out.bias | 0.06% | 0.17% | 0.00% | 1.00% | 0.00% | 81.09% | 17.68% | 0.01% |
| unet.mid_block.resnets.0.norm1.weight | 0.01% | 1.80% | 0.00% | 0.00% | 0.00% | 97.99% | 0.19% | 0.00% |
| unet.mid_block.resnets.0.norm1.bias | 0.00% | 0.00% | 0.00% | 0.00% | 0.01% | 99.99% | 0.00% | 0.00% |
| unet.mid_block.resnets.0.conv1.weight | 0.00% | 9.63% | 1.05% | 49.66% | 0.00% | 0.00% | 39.66% | 0.00% |
| unet.mid_block.resnets.0.conv1.bias | 0.27% | 0.00% | 0.02% | 49.54% | 0.00% | 0.00% | 50.16% | 0.00% |
| unet.mid_block.resnets.0.time_emb_proj.weight | 0.00% | 0.00% | 0.00% | 0.00% | 15.15% | 83.55% | 1.29% | 0.01% |
| unet.mid_block.resnets.0.time_emb_proj.bias | 0.34% | 13.94% | 0.00% | 0.00% | 0.55% | 1.53% | 83.65% | 0.00% |
| unet.mid_block.resnets.0.norm2.weight | 20.33% | 79.27% | 0.00% | 0.01% | 0.17% | 0.00% | 0.22% | 0.00% |
| unet.mid_block.resnets.0.norm2.bias | 0.00% | 0.33% | 22.44% | 0.30% | 76.88% | 0.05% | 0.00% | 0.00% |
| unet.mid_block.resnets.0.conv2.weight | 0.00% | 0.01% | 0.00% | 43.00% | 0.00% | 0.28% | 11.75% | 44.95% |
| unet.mid_block.resnets.0.conv2.bias | 0.00% | 0.00% | 0.12% | 27.71% | 36.52% | 4.14% | 31.01% | 0.50% |
| unet.mid_block.resnets.1.norm1.weight | 0.00% | 4.55% | 1.79% | 11.38% | 0.00% | 82.28% | 0.00% | 0.00% |
| unet.mid_block.resnets.1.norm1.bias | 0.00% | 0.00% | 0.00% | 0.48% | 0.04% | 99.47% | 0.00% | 0.00% |
| unet.mid_block.resnets.1.conv1.weight | 0.04% | 95.67% | 0.00% | 3.23% | 0.12% | 0.70% | 0.00% | 0.25% |
| unet.mid_block.resnets.1.conv1.bias | 0.00% | 0.20% | 0.09% | 0.01% | 83.46% | 1.40% | 2.94% | 11.91% |
| unet.mid_block.resnets.1.time_emb_proj.weight | 0.00% | 44.34% | 18.19% | 0.03% | 0.00% | 0.02% | 1.88% | 35.55% |
| unet.mid_block.resnets.1.time_emb_proj.bias | 0.00% | 1.31% | 0.00% | 72.02% | 0.00% | 0.05% | 26.61% | 0.00% |
| unet.mid_block.resnets.1.norm2.weight | 0.01% | 94.65% | 0.00% | 3.11% | 0.32% | 0.00% | 0.00% | 1.91% |
| unet.mid_block.resnets.1.norm2.bias | 0.00% | 42.45% | 0.00% | 0.00% | 0.00% | 57.26% | 0.29% | 0.00% |
| unet.mid_block.resnets.1.conv2.weight | 98.99% | 0.00% | 0.00% | 0.49% | 0.51% | 0.00% | 0.01% | 0.00% |
| unet.mid_block.resnets.1.conv2.bias | 0.00% | 0.00% | 0.00% | 56.04% | 14.86% | 0.00% | 29.09% | 0.00% |
| unet.conv_norm_out.weight | 0.00% | 2.13% | 0.00% | 0.00% | 0.00% | 10.57% | 87.30% | 0.00% |
| unet.conv_norm_out.bias | 0.00% | 0.08% | 0.00% | 40.88% | 2.58% | 0.00% | 56.41% | 0.05% |
| unet.conv_out.weight | 0.13% | 0.76% | 0.00% | 0.00% | 87.43% | 10.98% | 0.70% | 0.00% |
| unet.conv_out.bias | 1.51% | 98.25% | 0.00% | 0.00% | 0.00% | 0.23% | 0.00% | 0.00% |

| text_encoder | RMHF - 2.5D-V2 | RMHF - AnimeV1 | MeinaPastel - V6 | MeinaMix - V10 | CuteYukiMix - EchoDimension | ToonYou - Beta5Unstable | RealCartoon-Anime - V3 | Fantexi - V0.9 Beta |
| - | - | - | - | - | - | - | - | - |
| text_encoder.text_model.embeddings.position_ids | 0.00% | 0.00% | 49.08% | 0.35% | 0.00% | 0.75% | 49.81% | 0.00% |
| text_encoder.text_model.embeddings.token_embedding.weight | 9.13% | 8.82% | 0.00% | 0.02% | 0.30% | 0.00% | 0.02% | 81.71% |
| text_encoder.text_model.embeddings.position_embedding.weight | 0.00% | 24.08% | 75.09% | 0.00% | 0.00% | 0.82% | 0.00% | 0.01% |
| text_encoder.text_model.encoder.layers.0.self_attn.k_proj.weight | 0.00% | 0.00% | 0.00% | 8.58% | 1.10% | 86.16% | 4.16% | 0.00% |
| text_encoder.text_model.encoder.layers.0.self_attn.k_proj.bias | 0.00% | 80.72% | 9.42% | 0.00% | 0.36% | 0.39% | 2.10% | 7.02% |
| text_encoder.text_model.encoder.layers.0.self_attn.v_proj.weight | 0.00% | 0.00% | 0.00% | 98.50% | 0.00% | 1.45% | 0.00% | 0.05% |
| text_encoder.text_model.encoder.layers.0.self_attn.v_proj.bias | 0.30% | 0.00% | 3.32% | 0.52% | 5.38% | 88.79% | 0.10% | 1.60% |
| text_encoder.text_model.encoder.layers.0.self_attn.q_proj.weight | 1.53% | 0.00% | 0.00% | 0.00% | 49.16% | 0.00% | 49.29% | 0.02% |
| text_encoder.text_model.encoder.layers.0.self_attn.q_proj.bias | 16.66% | 0.00% | 0.00% | 0.01% | 0.14% | 11.40% | 0.00% | 71.79% |
| text_encoder.text_model.encoder.layers.0.self_attn.out_proj.weight | 0.79% | 0.24% | 22.64% | 61.58% | 12.06% | 0.05% | 2.62% | 0.03% |
| text_encoder.text_model.encoder.layers.0.self_attn.out_proj.bias | 0.17% | 99.42% | 0.10% | 0.20% | 0.02% | 0.00% | 0.00% | 0.09% |
| text_encoder.text_model.encoder.layers.0.layer_norm1.weight | 0.00% | 0.00% | 0.00% | 0.01% | 7.84% | 92.14% | 0.00% | 0.00% |
| text_encoder.text_model.encoder.layers.0.layer_norm1.bias | 5.07% | 0.00% | 0.00% | 47.72% | 47.20% | 0.00% | 0.00% | 0.00% |
| text_encoder.text_model.encoder.layers.0.mlp.fc1.weight | 0.00% | 0.00% | 0.00% | 41.69% | 0.00% | 0.91% | 57.40% | 0.00% |
| text_encoder.text_model.encoder.layers.0.mlp.fc1.bias | 0.37% | 0.00% | 0.00% | 91.34% | 1.87% | 0.00% | 0.34% | 6.07% |
| text_encoder.text_model.encoder.layers.0.mlp.fc2.weight | 0.00% | 0.00% | 66.43% | 0.04% | 0.10% | 33.21% | 0.21% | 0.01% |
| text_encoder.text_model.encoder.layers.0.mlp.fc2.bias | 0.00% | 0.53% | 0.00% | 0.10% | 0.00% | 0.14% | 99.23% | 0.00% |
| text_encoder.text_model.encoder.layers.0.layer_norm2.weight | 0.00% | 0.01% | 0.00% | 0.70% | 0.98% | 98.29% | 0.00% | 0.02% |
| text_encoder.text_model.encoder.layers.0.layer_norm2.bias | 2.10% | 49.75% | 0.00% | 2.73% | 0.00% | 0.00% | 41.69% | 3.72% |
| text_encoder.text_model.encoder.layers.1.self_attn.k_proj.weight | 0.00% | 0.00% | 0.00% | 0.00% | 99.20% | 0.00% | 0.07% | 0.73% |
| text_encoder.text_model.encoder.layers.1.self_attn.k_proj.bias | 0.00% | 0.07% | 2.20% | 0.00% | 0.00% | 0.00% | 97.72% | 0.00% |
| text_encoder.text_model.encoder.layers.1.self_attn.v_proj.weight | 0.00% | 0.01% | 0.26% | 0.01% | 6.90% | 0.00% | 92.82% | 0.00% |
| text_encoder.text_model.encoder.layers.1.self_attn.v_proj.bias | 0.00% | 0.00% | 0.06% | 0.00% | 2.53% | 23.45% | 33.77% | 40.19% |
| text_encoder.text_model.encoder.layers.1.self_attn.q_proj.weight | 0.00% | 33.16% | 0.07% | 0.23% | 57.89% | 1.02% | 6.09% | 1.54% |
| text_encoder.text_model.encoder.layers.1.self_attn.q_proj.bias | 2.32% | 3.65% | 0.06% | 19.28% | 4.90% | 6.55% | 0.00% | 63.23% |
| text_encoder.text_model.encoder.layers.1.self_attn.out_proj.weight | 0.00% | 0.00% | 85.47% | 0.00% | 13.92% | 0.32% | 0.00% | 0.27% |
| text_encoder.text_model.encoder.layers.1.self_attn.out_proj.bias | 0.00% | 0.02% | 0.02% | 0.98% | 98.98% | 0.00% | 0.00% | 0.00% |
| text_encoder.text_model.encoder.layers.1.layer_norm1.weight | 0.00% | 0.06% | 0.00% | 0.00% | 0.00% | 99.94% | 0.00% | 0.00% |
| text_encoder.text_model.encoder.layers.1.layer_norm1.bias | 0.00% | 2.38% | 0.00% | 0.00% | 0.00% | 4.53% | 92.87% | 0.22% |
| text_encoder.text_model.encoder.layers.1.mlp.fc1.weight | 0.04% | 10.29% | 35.21% | 0.00% | 54.36% | 0.03% | 0.00% | 0.07% |
| text_encoder.text_model.encoder.layers.1.mlp.fc1.bias | 0.00% | 0.03% | 0.39% | 32.55% | 0.00% | 0.00% | 0.06% | 66.96% |
| text_encoder.text_model.encoder.layers.1.mlp.fc2.weight | 0.06% | 0.00% | 0.00% | 0.65% | 97.61% | 0.00% | 0.80% | 0.87% |
| text_encoder.text_model.encoder.layers.1.mlp.fc2.bias | 0.00% | 0.01% | 1.02% | 0.00% | 98.97% | 0.00% | 0.00% | 0.00% |
| text_encoder.text_model.encoder.layers.1.layer_norm2.weight | 0.00% | 4.46% | 1.16% | 4.83% | 89.24% | 0.29% | 0.02% | 0.00% |
| text_encoder.text_model.encoder.layers.1.layer_norm2.bias | 0.00% | 0.00% | 0.00% | 0.00% | 0.06% | 0.00% | 41.03% | 58.91% |
| text_encoder.text_model.encoder.layers.2.self_attn.k_proj.weight | 0.00% | 0.01% | 8.58% | 2.77% | 26.03% | 0.00% | 51.36% | 11.25% |
| text_encoder.text_model.encoder.layers.2.self_attn.k_proj.bias | 1.29% | 0.00% | 0.00% | 0.00% | 92.25% | 4.74% | 0.31% | 1.41% |
| text_encoder.text_model.encoder.layers.2.self_attn.v_proj.weight | 0.00% | 0.00% | 0.00% | 52.62% | 0.01% | 0.00% | 47.34% | 0.03% |
| text_encoder.text_model.encoder.layers.2.self_attn.v_proj.bias | 0.03% | 3.18% | 0.11% | 0.02% | 70.38% | 7.80% | 11.60% | 6.88% |
| text_encoder.text_model.encoder.layers.2.self_attn.q_proj.weight | 0.00% | 0.00% | 0.00% | 0.00% | 100.00% | 0.00% | 0.00% | 0.00% |
| text_encoder.text_model.encoder.layers.2.self_attn.q_proj.bias | 0.00% | 0.04% | 0.00% | 4.16% | 95.72% | 0.07% | 0.00% | 0.00% |
| text_encoder.text_model.encoder.layers.2.self_attn.out_proj.weight | 0.59% | 0.00% | 6.90% | 92.50% | 0.00% | 0.00% | 0.00% | 0.00% |
| text_encoder.text_model.encoder.layers.2.self_attn.out_proj.bias | 0.16% | 0.00% | 0.00% | 0.29% | 99.34% | 0.21% | 0.00% | 0.00% |
| text_encoder.text_model.encoder.layers.2.layer_norm1.weight | 0.01% | 0.00% | 0.01% | 0.03% | 0.00% | 0.00% | 0.00% | 99.96% |
| text_encoder.text_model.encoder.layers.2.layer_norm1.bias | 22.68% | 2.20% | 59.26% | 4.48% | 0.62% | 10.75% | 0.00% | 0.00% |
| text_encoder.text_model.encoder.layers.2.mlp.fc1.weight | 0.00% | 0.09% | 0.00% | 23.38% | 0.00% | 2.11% | 0.00% | 74.42% |
| text_encoder.text_model.encoder.layers.2.mlp.fc1.bias | 1.99% | 0.00% | 42.92% | 0.01% | 55.07% | 0.00% | 0.00% | 0.01% |
| text_encoder.text_model.encoder.layers.2.mlp.fc2.weight | 0.05% | 0.00% | 30.41% | 0.02% | 53.91% | 4.10% | 11.52% | 0.00% |
| text_encoder.text_model.encoder.layers.2.mlp.fc2.bias | 0.00% | 0.32% | 17.03% | 0.00% | 0.00% | 1.69% | 0.00% | 80.95% |
| text_encoder.text_model.encoder.layers.2.layer_norm2.weight | 1.65% | 14.65% | 0.00% | 79.67% | 3.99% | 0.00% | 0.02% | 0.02% |
| text_encoder.text_model.encoder.layers.2.layer_norm2.bias | 2.89% | 41.27% | 55.77% | 0.01% | 0.05% | 0.00% | 0.00% | 0.01% |
| text_encoder.text_model.encoder.layers.3.self_attn.k_proj.weight | 0.14% | 0.06% | 0.00% | 0.00% | 9.80% | 45.53% | 4.78% | 39.68% |
| text_encoder.text_model.encoder.layers.3.self_attn.k_proj.bias | 0.00% | 5.69% | 0.00% | 0.00% | 0.00% | 94.28% | 0.03% | 0.00% |
| text_encoder.text_model.encoder.layers.3.self_attn.v_proj.weight | 4.81% | 0.69% | 0.00% | 83.10% | 0.60% | 10.80% | 0.00% | 0.00% |
| text_encoder.text_model.encoder.layers.3.self_attn.v_proj.bias | 0.00% | 0.12% | 0.00% | 0.05% | 99.83% | 0.00% | 0.00% | 0.00% |
| text_encoder.text_model.encoder.layers.3.self_attn.q_proj.weight | 0.00% | 0.00% | 8.58% | 0.02% | 88.29% | 0.00% | 3.10% | 0.01% |
| text_encoder.text_model.encoder.layers.3.self_attn.q_proj.bias | 0.00% | 0.00% | 19.89% | 10.61% | 0.28% | 0.00% | 0.00% | 69.22% |
| text_encoder.text_model.encoder.layers.3.self_attn.out_proj.weight | 34.00% | 0.70% | 0.00% | 1.15% | 0.02% | 3.78% | 0.00% | 60.34% |
| text_encoder.text_model.encoder.layers.3.self_attn.out_proj.bias | 0.00% | 0.28% | 0.00% | 0.00% | 0.02% | 99.69% | 0.00% | 0.01% |
| text_encoder.text_model.encoder.layers.3.layer_norm1.weight | 0.00% | 0.00% | 0.22% | 0.01% | 94.75% | 2.48% | 0.00% | 2.55% |
| text_encoder.text_model.encoder.layers.3.layer_norm1.bias | 0.00% | 0.00% | 0.00% | 49.47% | 0.00% | 50.53% | 0.00% | 0.00% |
| text_encoder.text_model.encoder.layers.3.mlp.fc1.weight | 0.00% | 0.01% | 0.01% | 5.84% | 0.03% | 0.62% | 93.49% | 0.00% |
| text_encoder.text_model.encoder.layers.3.mlp.fc1.bias | 0.60% | 4.55% | 0.00% | 0.01% | 94.81% | 0.00% | 0.00% | 0.02% |
| text_encoder.text_model.encoder.layers.3.mlp.fc2.weight | 0.35% | 1.85% | 73.91% | 11.65% | 1.51% | 1.27% | 0.05% | 9.41% |
| text_encoder.text_model.encoder.layers.3.mlp.fc2.bias | 0.03% | 50.31% | 0.05% | 0.25% | 11.21% | 0.91% | 37.24% | 0.00% |
| text_encoder.text_model.encoder.layers.3.layer_norm2.weight | 79.98% | 0.01% | 18.83% | 0.51% | 0.59% | 0.00% | 0.08% | 0.00% |
| text_encoder.text_model.encoder.layers.3.layer_norm2.bias | 0.01% | 0.00% | 0.00% | 0.00% | 5.00% | 0.00% | 76.85% | 18.13% |
| text_encoder.text_model.encoder.layers.4.self_attn.k_proj.weight | 0.00% | 0.00% | 0.00% | 0.19% | 0.06% | 0.00% | 0.00% | 99.74% |
| text_encoder.text_model.encoder.layers.4.self_attn.k_proj.bias | 0.01% | 0.07% | 0.90% | 95.25% | 3.26% | 0.03% | 0.01% | 0.48% |
| text_encoder.text_model.encoder.layers.4.self_attn.v_proj.weight | 0.00% | 0.00% | 13.13% | 45.22% | 0.00% | 32.33% | 0.00% | 9.31% |
| text_encoder.text_model.encoder.layers.4.self_attn.v_proj.bias | 0.00% | 0.00% | 0.02% | 0.00% | 0.00% | 0.01% | 99.86% | 0.11% |
| text_encoder.text_model.encoder.layers.4.self_attn.q_proj.weight | 0.00% | 0.00% | 16.36% | 2.01% | 78.62% | 2.89% | 0.03% | 0.11% |
| text_encoder.text_model.encoder.layers.4.self_attn.q_proj.bias | 27.86% | 0.00% | 0.00% | 49.21% | 22.93% | 0.00% | 0.00% | 0.00% |
| text_encoder.text_model.encoder.layers.4.self_attn.out_proj.weight | 9.23% | 0.00% | 0.00% | 0.09% | 89.20% | 1.26% | 0.20% | 0.00% |
| text_encoder.text_model.encoder.layers.4.self_attn.out_proj.bias | 0.00% | 0.08% | 76.59% | 0.00% | 0.00% | 23.32% | 0.00% | 0.00% |
| text_encoder.text_model.encoder.layers.4.layer_norm1.weight | 0.00% | 63.72% | 0.00% | 15.79% | 0.00% | 19.76% | 0.72% | 0.00% |
| text_encoder.text_model.encoder.layers.4.layer_norm1.bias | 0.94% | 0.01% | 0.52% | 0.00% | 0.01% | 0.45% | 98.06% | 0.00% |
| text_encoder.text_model.encoder.layers.4.mlp.fc1.weight | 0.02% | 3.22% | 17.53% | 0.00% | 79.24% | 0.00% | 0.00% | 0.00% |
| text_encoder.text_model.encoder.layers.4.mlp.fc1.bias | 0.18% | 0.07% | 0.00% | 4.58% | 95.16% | 0.00% | 0.00% | 0.00% |
| text_encoder.text_model.encoder.layers.4.mlp.fc2.weight | 0.00% | 63.46% | 0.00% | 0.05% | 0.05% | 36.42% | 0.02% | 0.00% |
| text_encoder.text_model.encoder.layers.4.mlp.fc2.bias | 0.00% | 0.00% | 0.52% | 6.43% | 0.00% | 0.00% | 0.00% | 93.05% |
| text_encoder.text_model.encoder.layers.4.layer_norm2.weight | 76.10% | 0.00% | 18.60% | 5.22% | 0.07% | 0.00% | 0.00% | 0.00% |
| text_encoder.text_model.encoder.layers.4.layer_norm2.bias | 0.00% | 0.29% | 16.95% | 2.73% | 0.00% | 80.00% | 0.00% | 0.01% |
| text_encoder.text_model.encoder.layers.5.self_attn.k_proj.weight | 0.00% | 0.00% | 0.07% | 0.05% | 0.00% | 99.88% | 0.00% | 0.00% |
| text_encoder.text_model.encoder.layers.5.self_attn.k_proj.bias | 54.21% | 0.00% | 0.10% | 3.40% | 22.05% | 20.24% | 0.00% | 0.00% |
| text_encoder.text_model.encoder.layers.5.self_attn.v_proj.weight | 0.00% | 0.00% | 0.00% | 0.00% | 98.35% | 1.28% | 0.37% | 0.00% |
| text_encoder.text_model.encoder.layers.5.self_attn.v_proj.bias | 0.08% | 25.55% | 0.00% | 2.86% | 5.21% | 65.59% | 0.03% | 0.69% |
| text_encoder.text_model.encoder.layers.5.self_attn.q_proj.weight | 0.00% | 0.00% | 0.00% | 1.94% | 0.11% | 23.94% | 0.00% | 74.02% |
| text_encoder.text_model.encoder.layers.5.self_attn.q_proj.bias | 98.94% | 0.00% | 0.01% | 0.00% | 0.00% | 0.08% | 0.97% | 0.00% |
| text_encoder.text_model.encoder.layers.5.self_attn.out_proj.weight | 0.00% | 0.00% | 0.00% | 0.00% | 99.95% | 0.05% | 0.00% | 0.00% |
| text_encoder.text_model.encoder.layers.5.self_attn.out_proj.bias | 0.00% | 58.19% | 0.00% | 41.81% | 0.00% | 0.00% | 0.00% | 0.00% |
| text_encoder.text_model.encoder.layers.5.layer_norm1.weight | 5.50% | 0.70% | 0.17% | 13.79% | 43.67% | 36.15% | 0.01% | 0.01% |
| text_encoder.text_model.encoder.layers.5.layer_norm1.bias | 0.03% | 0.01% | 0.00% | 0.02% | 10.74% | 0.01% | 1.03% | 88.17% |
| text_encoder.text_model.encoder.layers.5.mlp.fc1.weight | 0.17% | 28.68% | 0.01% | 5.25% | 0.10% | 0.01% | 65.77% | 0.00% |
| text_encoder.text_model.encoder.layers.5.mlp.fc1.bias | 0.21% | 0.00% | 0.86% | 87.32% | 0.00% | 10.37% | 1.13% | 0.11% |
| text_encoder.text_model.encoder.layers.5.mlp.fc2.weight | 0.00% | 0.00% | 0.00% | 98.76% | 0.00% | 0.00% | 1.24% | 0.00% |
| text_encoder.text_model.encoder.layers.5.mlp.fc2.bias | 0.00% | 52.98% | 0.02% | 0.40% | 24.44% | 0.02% | 22.13% | 0.00% |
| text_encoder.text_model.encoder.layers.5.layer_norm2.weight | 0.00% | 0.00% | 0.00% | 0.00% | 7.57% | 0.00% | 51.70% | 40.72% |
| text_encoder.text_model.encoder.layers.5.layer_norm2.bias | 0.00% | 1.11% | 0.00% | 52.67% | 0.01% | 42.62% | 0.05% | 3.53% |
| text_encoder.text_model.encoder.layers.6.self_attn.k_proj.weight | 2.86% | 1.98% | 32.59% | 0.00% | 0.00% | 62.57% | 0.00% | 0.00% |
| text_encoder.text_model.encoder.layers.6.self_attn.k_proj.bias | 0.02% | 70.31% | 0.06% | 0.44% | 12.69% | 0.00% | 0.00% | 16.47% |
| text_encoder.text_model.encoder.layers.6.self_attn.v_proj.weight | 39.61% | 3.40% | 0.00% | 0.02% | 55.84% | 0.00% | 0.65% | 0.49% |
| text_encoder.text_model.encoder.layers.6.self_attn.v_proj.bias | 0.00% | 0.00% | 0.06% | 0.93% | 0.01% | 99.00% | 0.00% | 0.00% |
| text_encoder.text_model.encoder.layers.6.self_attn.q_proj.weight | 10.52% | 85.33% | 0.00% | 0.20% | 0.10% | 0.04% | 3.81% | 0.00% |
| text_encoder.text_model.encoder.layers.6.self_attn.q_proj.bias | 87.22% | 0.00% | 0.36% | 2.75% | 2.21% | 3.13% | 0.00% | 4.34% |
| text_encoder.text_model.encoder.layers.6.self_attn.out_proj.weight | 0.43% | 7.93% | 0.00% | 2.51% | 45.68% | 0.04% | 0.00% | 43.41% |
| text_encoder.text_model.encoder.layers.6.self_attn.out_proj.bias | 0.04% | 0.01% | 0.00% | 4.58% | 1.44% | 93.94% | 0.00% | 0.00% |
| text_encoder.text_model.encoder.layers.6.layer_norm1.weight | 0.04% | 0.00% | 0.07% | 91.66% | 8.06% | 0.15% | 0.01% | 0.00% |
| text_encoder.text_model.encoder.layers.6.layer_norm1.bias | 0.00% | 0.00% | 0.00% | 0.03% | 97.95% | 1.22% | 0.34% | 0.46% |
| text_encoder.text_model.encoder.layers.6.mlp.fc1.weight | 0.74% | 0.00% | 0.03% | 0.00% | 77.10% | 19.61% | 2.17% | 0.34% |
| text_encoder.text_model.encoder.layers.6.mlp.fc1.bias | 0.00% | 0.02% | 33.58% | 0.00% | 59.32% | 0.00% | 7.07% | 0.00% |
| text_encoder.text_model.encoder.layers.6.mlp.fc2.weight | 0.02% | 0.00% | 0.00% | 99.78% | 0.00% | 0.18% | 0.02% | 0.00% |
| text_encoder.text_model.encoder.layers.6.mlp.fc2.bias | 0.08% | 0.05% | 0.05% | 5.11% | 0.08% | 94.63% | 0.00% | 0.00% |
| text_encoder.text_model.encoder.layers.6.layer_norm2.weight | 0.05% | 0.00% | 0.00% | 0.27% | 2.66% | 97.01% | 0.01% | 0.00% |
| text_encoder.text_model.encoder.layers.6.layer_norm2.bias | 0.00% | 0.03% | 0.06% | 0.02% | 99.88% | 0.01% | 0.00% | 0.00% |
| text_encoder.text_model.encoder.layers.7.self_attn.k_proj.weight | 0.00% | 68.35% | 0.00% | 0.00% | 31.64% | 0.00% | 0.00% | 0.00% |
| text_encoder.text_model.encoder.layers.7.self_attn.k_proj.bias | 0.00% | 0.01% | 80.17% | 0.00% | 1.00% | 0.01% | 18.82% | 0.00% |
| text_encoder.text_model.encoder.layers.7.self_attn.v_proj.weight | 0.00% | 0.00% | 33.71% | 1.73% | 8.87% | 21.86% | 0.01% | 33.81% |
| text_encoder.text_model.encoder.layers.7.self_attn.v_proj.bias | 0.00% | 20.99% | 0.04% | 3.59% | 0.00% | 75.29% | 0.08% | 0.01% |
| text_encoder.text_model.encoder.layers.7.self_attn.q_proj.weight | 2.14% | 0.00% | 0.00% | 9.77% | 88.08% | 0.00% | 0.00% | 0.00% |
| text_encoder.text_model.encoder.layers.7.self_attn.q_proj.bias | 0.56% | 0.00% | 0.00% | 93.38% | 1.06% | 3.77% | 0.02% | 1.20% |
| text_encoder.text_model.encoder.layers.7.self_attn.out_proj.weight | 0.00% | 0.00% | 0.02% | 0.04% | 98.71% | 0.01% | 0.00% | 1.22% |
| text_encoder.text_model.encoder.layers.7.self_attn.out_proj.bias | 0.05% | 85.45% | 6.15% | 0.00% | 0.00% | 2.25% | 0.00% | 6.11% |
| text_encoder.text_model.encoder.layers.7.layer_norm1.weight | 0.00% | 0.00% | 2.50% | 0.23% | 29.41% | 0.01% | 63.49% | 4.36% |
| text_encoder.text_model.encoder.layers.7.layer_norm1.bias | 0.00% | 0.56% | 0.00% | 6.02% | 3.02% | 0.00% | 90.40% | 0.00% |
| text_encoder.text_model.encoder.layers.7.mlp.fc1.weight | 0.00% | 0.01% | 10.12% | 53.98% | 1.26% | 34.62% | 0.00% | 0.00% |
| text_encoder.text_model.encoder.layers.7.mlp.fc1.bias | 0.00% | 0.03% | 0.00% | 0.03% | 40.88% | 0.00% | 22.02% | 37.04% |
| text_encoder.text_model.encoder.layers.7.mlp.fc2.weight | 92.47% | 0.00% | 0.01% | 7.03% | 0.00% | 0.12% | 0.00% | 0.37% |
| text_encoder.text_model.encoder.layers.7.mlp.fc2.bias | 0.00% | 0.24% | 0.63% | 39.08% | 0.00% | 37.37% | 22.63% | 0.05% |
| text_encoder.text_model.encoder.layers.7.layer_norm2.weight | 14.49% | 1.38% | 0.00% | 74.18% | 0.00% | 9.69% | 0.25% | 0.02% |
| text_encoder.text_model.encoder.layers.7.layer_norm2.bias | 0.60% | 82.34% | 0.00% | 0.36% | 2.72% | 5.95% | 7.93% | 0.10% |
| text_encoder.text_model.encoder.layers.8.self_attn.k_proj.weight | 0.00% | 90.49% | 7.24% | 0.00% | 0.43% | 1.83% | 0.01% | 0.00% |
| text_encoder.text_model.encoder.layers.8.self_attn.k_proj.bias | 0.00% | 0.00% | 50.72% | 46.64% | 0.00% | 2.07% | 0.57% | 0.00% |
| text_encoder.text_model.encoder.layers.8.self_attn.v_proj.weight | 0.02% | 0.00% | 0.03% | 0.00% | 2.10% | 90.94% | 0.07% | 6.84% |
| text_encoder.text_model.encoder.layers.8.self_attn.v_proj.bias | 0.00% | 5.36% | 0.00% | 0.01% | 25.04% | 69.29% | 0.07% | 0.23% |
| text_encoder.text_model.encoder.layers.8.self_attn.q_proj.weight | 94.49% | 0.14% | 0.11% | 0.66% | 3.39% | 1.19% | 0.00% | 0.03% |
| text_encoder.text_model.encoder.layers.8.self_attn.q_proj.bias | 0.00% | 0.01% | 0.00% | 0.62% | 0.83% | 15.87% | 73.47% | 9.21% |
| text_encoder.text_model.encoder.layers.8.self_attn.out_proj.weight | 2.38% | 37.60% | 3.46% | 13.60% | 0.00% | 0.02% | 42.95% | 0.00% |
| text_encoder.text_model.encoder.layers.8.self_attn.out_proj.bias | 75.31% | 0.00% | 10.99% | 12.10% | 1.47% | 0.14% | 0.00% | 0.00% |
| text_encoder.text_model.encoder.layers.8.layer_norm1.weight | 0.32% | 0.00% | 0.00% | 64.15% | 35.50% | 0.00% | 0.00% | 0.03% |
| text_encoder.text_model.encoder.layers.8.layer_norm1.bias | 88.12% | 2.45% | 0.00% | 1.01% | 0.05% | 0.28% | 0.01% | 8.10% |
| text_encoder.text_model.encoder.layers.8.mlp.fc1.weight | 0.00% | 8.44% | 15.62% | 0.00% | 0.00% | 0.64% | 0.08% | 75.22% |
| text_encoder.text_model.encoder.layers.8.mlp.fc1.bias | 0.00% | 0.00% | 0.62% | 0.00% | 80.61% | 18.77% | 0.00% | 0.00% |
| text_encoder.text_model.encoder.layers.8.mlp.fc2.weight | 1.73% | 0.75% | 0.14% | 0.00% | 2.81% | 94.52% | 0.06% | 0.00% |
| text_encoder.text_model.encoder.layers.8.mlp.fc2.bias | 0.00% | 0.00% | 4.09% | 0.39% | 58.75% | 0.91% | 0.00% | 35.86% |
| text_encoder.text_model.encoder.layers.8.layer_norm2.weight | 0.00% | 2.72% | 2.53% | 0.00% | 0.02% | 8.67% | 0.01% | 86.05% |
| text_encoder.text_model.encoder.layers.8.layer_norm2.bias | 34.17% | 0.04% | 0.00% | 0.17% | 0.00% | 65.58% | 0.00% | 0.05% |
| text_encoder.text_model.encoder.layers.9.self_attn.k_proj.weight | 0.04% | 0.00% | 4.08% | 0.00% | 12.27% | 5.97% | 0.00% | 77.64% |
| text_encoder.text_model.encoder.layers.9.self_attn.k_proj.bias | 0.00% | 0.09% | 0.00% | 0.00% | 0.03% | 99.87% | 0.00% | 0.00% |
| text_encoder.text_model.encoder.layers.9.self_attn.v_proj.weight | 0.00% | 0.04% | 0.00% | 0.00% | 65.63% | 34.33% | 0.00% | 0.00% |
| text_encoder.text_model.encoder.layers.9.self_attn.v_proj.bias | 0.36% | 0.05% | 0.00% | 13.07% | 49.20% | 0.00% | 0.00% | 37.32% |
| text_encoder.text_model.encoder.layers.9.self_attn.q_proj.weight | 0.00% | 6.41% | 1.60% | 1.58% | 39.45% | 50.97% | 0.00% | 0.00% |
| text_encoder.text_model.encoder.layers.9.self_attn.q_proj.bias | 0.00% | 0.00% | 0.00% | 0.00% | 75.25% | 16.27% | 0.00% | 8.48% |
| text_encoder.text_model.encoder.layers.9.self_attn.out_proj.weight | 0.00% | 10.63% | 0.00% | 0.00% | 24.98% | 0.29% | 0.33% | 63.76% |
| text_encoder.text_model.encoder.layers.9.self_attn.out_proj.bias | 0.02% | 12.17% | 7.21% | 2.68% | 0.06% | 0.45% | 13.82% | 63.60% |
| text_encoder.text_model.encoder.layers.9.layer_norm1.weight | 0.00% | 93.47% | 0.15% | 0.98% | 5.39% | 0.00% | 0.00% | 0.01% |
| text_encoder.text_model.encoder.layers.9.layer_norm1.bias | 26.58% | 0.03% | 0.16% | 0.00% | 13.38% | 27.98% | 16.95% | 14.90% |
| text_encoder.text_model.encoder.layers.9.mlp.fc1.weight | 0.00% | 0.00% | 0.00% | 0.06% | 0.00% | 0.42% | 0.47% | 99.06% |
| text_encoder.text_model.encoder.layers.9.mlp.fc1.bias | 95.33% | 0.00% | 1.63% | 0.00% | 0.00% | 2.99% | 0.00% | 0.04% |
| text_encoder.text_model.encoder.layers.9.mlp.fc2.weight | 0.02% | 3.08% | 0.54% | 0.01% | 31.79% | 64.42% | 0.03% | 0.12% |
| text_encoder.text_model.encoder.layers.9.mlp.fc2.bias | 0.00% | 0.03% | 0.00% | 2.35% | 0.01% | 0.03% | 0.00% | 97.57% |
| text_encoder.text_model.encoder.layers.9.layer_norm2.weight | 0.00% | 0.00% | 0.03% | 0.00% | 0.00% | 0.02% | 0.00% | 99.95% |
| text_encoder.text_model.encoder.layers.9.layer_norm2.bias | 0.03% | 0.05% | 23.71% | 0.00% | 0.01% | 0.65% | 75.54% | 0.01% |
| text_encoder.text_model.encoder.layers.10.self_attn.k_proj.weight | 0.00% | 0.76% | 5.94% | 0.01% | 16.30% | 76.85% | 0.14% | 0.00% |
| text_encoder.text_model.encoder.layers.10.self_attn.k_proj.bias | 0.00% | 0.19% | 0.06% | 1.15% | 98.59% | 0.00% | 0.01% | 0.00% |
| text_encoder.text_model.encoder.layers.10.self_attn.v_proj.weight | 11.69% | 19.11% | 0.36% | 11.61% | 25.92% | 0.00% | 31.31% | 0.00% |
| text_encoder.text_model.encoder.layers.10.self_attn.v_proj.bias | 2.01% | 11.63% | 7.13% | 0.00% | 76.99% | 2.14% | 0.00% | 0.11% |
| text_encoder.text_model.encoder.layers.10.self_attn.q_proj.weight | 0.01% | 44.78% | 54.90% | 0.00% | 0.00% | 0.30% | 0.00% | 0.00% |
| text_encoder.text_model.encoder.layers.10.self_attn.q_proj.bias | 0.00% | 0.03% | 0.00% | 0.00% | 99.78% | 0.13% | 0.00% | 0.05% |
| text_encoder.text_model.encoder.layers.10.self_attn.out_proj.weight | 0.10% | 4.95% | 0.00% | 0.00% | 8.17% | 37.74% | 42.38% | 6.67% |
| text_encoder.text_model.encoder.layers.10.self_attn.out_proj.bias | 0.01% | 0.00% | 0.00% | 1.62% | 43.86% | 25.73% | 0.00% | 28.78% |
| text_encoder.text_model.encoder.layers.10.layer_norm1.weight | 0.00% | 10.58% | 63.79% | 21.51% | 0.35% | 0.00% | 3.01% | 0.76% |
| text_encoder.text_model.encoder.layers.10.layer_norm1.bias | 0.00% | 0.14% | 4.56% | 11.98% | 38.60% | 0.00% | 44.54% | 0.17% |
| text_encoder.text_model.encoder.layers.10.mlp.fc1.weight | 6.28% | 93.18% | 0.00% | 0.00% | 0.00% | 0.55% | 0.00% | 0.00% |
| text_encoder.text_model.encoder.layers.10.mlp.fc1.bias | 0.00% | 0.04% | 0.00% | 88.89% | 11.06% | 0.00% | 0.00% | 0.00% |
| text_encoder.text_model.encoder.layers.10.mlp.fc2.weight | 0.00% | 0.00% | 0.16% | 1.35% | 0.00% | 98.49% | 0.00% | 0.00% |
| text_encoder.text_model.encoder.layers.10.mlp.fc2.bias | 0.00% | 0.00% | 0.00% | 0.00% | 0.00% | 100.00% | 0.00% | 0.00% |
| text_encoder.text_model.encoder.layers.10.layer_norm2.weight | 0.00% | 0.00% | 97.77% | 0.00% | 2.23% | 0.00% | 0.00% | 0.00% |
| text_encoder.text_model.encoder.layers.10.layer_norm2.bias | 0.23% | 1.57% | 88.56% | 0.00% | 1.01% | 8.59% | 0.04% | 0.00% |
| text_encoder.text_model.encoder.layers.11.self_attn.k_proj.weight | 0.00% | 0.00% | 0.00% | 0.00% | 1.29% | 0.01% | 72.96% | 25.74% |
| text_encoder.text_model.encoder.layers.11.self_attn.k_proj.bias | 0.00% | 0.10% | 1.35% | 1.27% | 97.15% | 0.02% | 0.12% | 0.00% |
| text_encoder.text_model.encoder.layers.11.self_attn.v_proj.weight | 0.00% | 0.05% | 0.00% | 0.00% | 22.96% | 0.00% | 0.00% | 76.98% |
| text_encoder.text_model.encoder.layers.11.self_attn.v_proj.bias | 5.45% | 0.00% | 0.29% | 1.05% | 92.42% | 0.40% | 0.34% | 0.04% |
| text_encoder.text_model.encoder.layers.11.self_attn.q_proj.weight | 88.42% | 0.00% | 0.00% | 0.00% | 11.58% | 0.00% | 0.00% | 0.00% |
| text_encoder.text_model.encoder.layers.11.self_attn.q_proj.bias | 0.00% | 0.00% | 4.99% | 0.11% | 0.03% | 0.00% | 94.87% | 0.00% |
| text_encoder.text_model.encoder.layers.11.self_attn.out_proj.weight | 0.00% | 0.00% | 0.00% | 99.29% | 0.69% | 0.02% | 0.00% | 0.00% |
| text_encoder.text_model.encoder.layers.11.self_attn.out_proj.bias | 0.04% | 87.06% | 6.49% | 0.00% | 5.12% | 1.29% | 0.00% | 0.00% |
| text_encoder.text_model.encoder.layers.11.layer_norm1.weight | 0.01% | 0.00% | 0.00% | 5.65% | 73.64% | 0.01% | 0.25% | 20.44% |
| text_encoder.text_model.encoder.layers.11.layer_norm1.bias | 8.35% | 37.85% | 0.00% | 2.99% | 50.00% | 0.07% | 0.00% | 0.75% |
| text_encoder.text_model.encoder.layers.11.mlp.fc1.weight | 0.00% | 0.00% | 0.00% | 39.74% | 2.97% | 53.89% | 0.00% | 3.40% |
| text_encoder.text_model.encoder.layers.11.mlp.fc1.bias | 0.18% | 12.53% | 3.33% | 0.00% | 79.77% | 4.16% | 0.00% | 0.02% |
| text_encoder.text_model.encoder.layers.11.mlp.fc2.weight | 58.92% | 0.00% | 1.61% | 0.00% | 0.00% | 38.72% | 0.03% | 0.72% |
| text_encoder.text_model.encoder.layers.11.mlp.fc2.bias | 67.08% | 0.00% | 0.00% | 32.09% | 0.55% | 0.27% | 0.00% | 0.00% |
| text_encoder.text_model.encoder.layers.11.layer_norm2.weight | 0.00% | 0.00% | 91.02% | 0.00% | 0.01% | 0.00% | 0.00% | 8.95% |
| text_encoder.text_model.encoder.layers.11.layer_norm2.bias | 0.23% | 0.00% | 0.00% | 0.00% | 99.76% | 0.00% | 0.00% | 0.00% |
| text_encoder.text_model.final_layer_norm.weight | 0.00% | 1.42% | 0.23% | 0.00% | 0.00% | 0.00% | 1.08% | 97.27% |
| text_encoder.text_model.final_layer_norm.bias | 0.00% | 0.95% | 35.66% | 62.38% | 0.69% | 0.15% | 0.18% | 0.00% |
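
The tables above can be read as per-tensor blend weights: for each parameter, the merged tensor appears to be a weighted combination of the corresponding tensors from the source checkpoints, with the listed percentages as coefficients. The snippet below is only a minimal sketch of that kind of per-parameter weighted merge, assuming safetensors state dicts; the file names and the `weights` mapping are hypothetical placeholders, not the actual recipe or values used for this model.

```python
# Minimal sketch of a per-parameter weighted merge.
# Assumption: the percentages in the tables above act as blend coefficients
# for each tensor. File names and the weights dict below are placeholders.
import torch
from safetensors.torch import load_file, save_file

# Hypothetical source checkpoints (safetensors state dicts with matching keys).
sources = {
    "model_a": load_file("model_a.safetensors"),
    "model_b": load_file("model_b.safetensors"),
}

# Hypothetical per-parameter coefficients, analogous to one row of the tables,
# expressed as fractions instead of percentages.
weights = {
    "unet.conv_in.weight": {"model_a": 0.9993, "model_b": 0.0007},
}

merged = {}
for key, coeffs in weights.items():
    # Weighted sum of the corresponding tensor from each source model.
    acc = None
    for name, w in coeffs.items():
        term = sources[name][key].to(torch.float32) * w
        acc = term if acc is None else acc + term
    merged[key] = acc

save_file(merged, "merged.safetensors")
```

In practice a full merge would iterate over every key shared by the checkpoints and cast the result back to the original dtype; the loop above only illustrates how one row of the table maps to a single merged tensor.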