sayakpaul committed
Commit
0c24640
1 Parent(s): 78f284a

usage notes

Files changed (1)
  1. app.py +2 -1
app.py CHANGED
@@ -208,7 +208,8 @@ with gr.Blocks(theme=gr.themes.Soft()) as demo:
  """<img src="https://huggingface.co/spaces/hf-accelerate/model-memory-usage/resolve/main/measure_model_size.png" style="float: left;" width="150" height="175"><h1>🧨 Diffusers Pipeline Memory Calculator</h1>
  This tool will help you to gauge the memory requirements of a Diffusers pipeline. Pipelines containing text encoders with sharded checkpoints are also supported
  (PixArt-Alpha, for example) 🤗 See instructions below the form on how to pass `controlnet_id` or `t2_adapter_id`. When performing inference, expect to add up to an
- additional 20% to this as found by [EleutherAI](https://blog.eleuther.ai/transformer-math/). Design adapted from [this Space](https://huggingface.co/spaces/hf-accelerate/model-memory-usage).
+ additional 20% to this as found by [EleutherAI](https://blog.eleuther.ai/transformer-math/). You can click on one of the examples below the "Calculate Memory Usage" button
+ to get started. Design adapted from [this Space](https://huggingface.co/spaces/hf-accelerate/model-memory-usage).
  """
  )
  out_text = gr.Markdown()
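
The note added above cites the roughly 20% inference overhead from EleutherAI's transformer-math post. The sketch below is not the Space's actual implementation; it is a minimal illustration of how such an estimate could be produced with `diffusers`, assuming the pipeline is loaded just to count parameters (the real calculator reads checkpoint metadata instead). The helper name `estimate_pipeline_memory_gb` and the commented example model id are illustrative only.

```python
# Minimal sketch: sum component parameter counts, convert to bytes for a given
# dtype, then add ~20% headroom for inference (EleutherAI rule of thumb).
import torch
from diffusers import DiffusionPipeline

BYTES_PER_PARAM = {torch.float32: 4, torch.float16: 2, torch.bfloat16: 2}

def estimate_pipeline_memory_gb(pipeline_id: str, dtype=torch.float16, overhead=0.20) -> float:
    # Loading the full pipeline just to count parameters is wasteful; this is
    # for illustration only.
    pipe = DiffusionPipeline.from_pretrained(pipeline_id, torch_dtype=dtype)
    total_params = 0
    for name, component in pipe.components.items():
        # Tokenizers, schedulers, and None components have no parameters.
        if hasattr(component, "parameters"):
            total_params += sum(p.numel() for p in component.parameters())
    weight_bytes = total_params * BYTES_PER_PARAM[dtype]
    return weight_bytes * (1 + overhead) / 1024**3

# Example usage (model id is illustrative):
# print(estimate_pipeline_memory_gb("PixArt-alpha/PixArt-XL-2-1024-MS"))
```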