sayakpaul committed
Commit b7823b4
1 Parent(s): 1632490

Update app.py

Files changed (1)
  1. app.py +2 -2
app.py CHANGED
@@ -220,8 +220,8 @@ with gr.Blocks(theme=gr.themes.Soft()) as demo:
  """<img src="https://huggingface.co/spaces/hf-accelerate/model-memory-usage/resolve/main/measure_model_size.png" style="float: left;" width="150" height="175"><h1>🧨 Diffusers Pipeline Memory Calculator</h1>
  This tool will help you to gauge the memory requirements of a Diffusers pipeline. Pipelines containing text encoders with sharded checkpoints are also supported
  (PixArt-Alpha, for example) 🤗 See instructions below the form on how to pass `controlnet_id` or `t2_adapter_id`. When performing inference, expect to add up to an
- additional 20% to this as found by [EleutherAI](https://blog.eleuther.ai/transformer-math/). You can click on one of the examples below the "Calculate Memory Usage" button
- to get started. Design adapted from [this Space](https://huggingface.co/spaces/hf-accelerate/model-memory-usage).
+ additional 20% to this as found by [EleutherAI](https://blog.eleuther.ai/transformer-math/). The final memory requirement will also depend on the requested resolution. You can click on one
+ of the examples below the "Calculate Memory Usage" button to get started. Design adapted from [this Space](https://huggingface.co/spaces/hf-accelerate/model-memory-usage).
  """
  )
  out_text = gr.Markdown()
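
For context, a minimal sketch of how a description string like the one edited above is typically wired into a Gradio Blocks layout. Only the names visible in the hunk (`gr.Blocks(theme=gr.themes.Soft())`, `out_text = gr.Markdown()`) come from the diff; everything else is an assumption for illustration, not the Space's actual app.py.

```python
# Minimal sketch, assuming the description is rendered with gr.Markdown inside
# the Blocks context named in the hunk header; not the Space's full app.py.
import gradio as gr

with gr.Blocks(theme=gr.themes.Soft()) as demo:
    gr.Markdown(
        # The commit rewrites the last two lines of this description string,
        # adding the note that the requested resolution also affects memory.
        """<h1>🧨 Diffusers Pipeline Memory Calculator</h1>
        This tool will help you to gauge the memory requirements of a Diffusers pipeline.
        When performing inference, expect to add up to an additional 20% to this.
        The final memory requirement will also depend on the requested resolution.
        """
    )
    out_text = gr.Markdown()  # empty Markdown component, presumably filled with the computed report

if __name__ == "__main__":
    demo.launch()
```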