add attribution to the original space
app.py CHANGED
@@ -208,7 +208,7 @@ with gr.Blocks(theme=gr.themes.Soft()) as demo:
         """<img src="https://huggingface.co/spaces/hf-accelerate/model-memory-usage/resolve/main/measure_model_size.png" style="float: left;" width="150" height="175"><h1>🧨 Diffusers Pipeline Memory Calculator</h1>
         This tool will help you to gauge the memory requirements of a Diffusers pipeline. Pipelines containing text encoders with sharded checkpoints are also supported
         (PixArt-Alpha, for example) 🤗 See instructions below the form on how to pass `controlnet_id` or `t2_adapter_id`. When performing inference, expect to add up to an
-        additional 20% to this as found by [EleutherAI](https://blog.eleuther.ai/transformer-math/).
+        additional 20% to this as found by [EleutherAI](https://blog.eleuther.ai/transformer-math/). Design adapted from [this Space](https://huggingface.co/spaces/hf-accelerate/model-memory-usage).
         """
     )
     out_text = gr.Markdown()
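For reference, the "up to an additional 20%" figure in the description is an inference-time overhead applied on top of the estimated checkpoint memory. A minimal sketch of that arithmetic, with hypothetical component names and sizes (not taken from this Space's code):

```python
# Sketch of the heuristic described in the app text above (assumed, not this Space's code):
# pipeline weight memory is the sum of its component checkpoints, and inference is
# expected to need up to ~20% on top of that, per the linked EleutherAI post.

# Hypothetical per-component sizes in GB (e.g. text encoder, transformer/UNet, VAE).
component_sizes_gb = {"text_encoder": 9.5, "transformer": 2.3, "vae": 0.3}

weights_gb = sum(component_sizes_gb.values())
inference_upper_bound_gb = weights_gb * 1.20  # up to an additional 20% at inference

print(f"Weights: {weights_gb:.1f} GB, inference estimate: up to {inference_upper_bound_gb:.1f} GB")
```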