""" | |
This is a file holding model-specific suggestions. | |
How to add a new suggestion: | |
1. Add a new constant at the bottom of the file with your suggestion. Please try to follow the same format as the | |
existing suggestions. | |
2. Add a new entry to the `MODEL_SUGGESTIONS`, with format `(model tag, (problem tags,), suggestion constant)`. | |
a. Make sure the model tag matches the exact same tag as on the Hub (e.g. GPT-J is `gptj`) | |
b. See `app.py` for the existing problem tags. | |
c. Make sure the problem tags are a tuple. | |
""" | |
GPTJ_USE_SAMPLING = """
<details><summary>{match_emoji} {count}. GPT-J - Avoid using Greedy Search and Beam Search.</summary>

🤔 Why?
According to its creators, "generating without sampling was actually surprisingly suboptimal".
🤗 How?
Our text generation interfaces accept a `do_sample` argument. Set it to `True` to ensure sampling-based strategies
are used.
💡 Source
1. [This tweet](https://twitter.com/EricHallahan/status/1627785461723721729) by a core member of EleutherAI, the
creator of GPT-J
_________________
</details>
"""
T5_FLOAT16 = """
<details><summary>{match_emoji} {count}. T5 - If you're using int8 or float16, make sure you have `transformers>=4.26.1`.</summary>

🤔 Why?
In a nutshell, some layers in T5 don't work well in lower precision unless they are in bf16. Newer versions of
`transformers` take care of upcasting the layers when needed.
🤗 How?
Make sure the dependencies in your workflow have `transformers>=4.26.1`.
💡 Source
1. See [this thread](https://github.com/huggingface/transformers/issues/20287) for the full discussion.
_________________
</details>
"""
MODEL_SUGGESTIONS = (
    ("gptj", ("quality",), GPTJ_USE_SAMPLING),
    ("t5", ("quality", "baseline", "speed"), T5_FLOAT16),
)
assert all(isinstance(problem_tags, tuple) for _, problem_tags, _ in MODEL_SUGGESTIONS)
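# A minimal sketch of how entries in a suggestions tuple like `MODEL_SUGGESTIONS`
# might be matched and formatted. The helper name `get_suggestions` and the "✅"
# default for `match_emoji` are illustrative assumptions, not part of this file:
# see `app.py` for the actual matching logic and problem tags.
def get_suggestions(suggestions, model_tag, problem_tag, match_emoji="✅"):
    """Return the formatted suggestions matching a model tag and a problem tag."""
    formatted = []
    for tag, problem_tags, suggestion in suggestions:
        if tag == model_tag and problem_tag in problem_tags:
            # `count` numbers the matches; the templates above expect both fields.
            formatted.append(
                suggestion.format(match_emoji=match_emoji, count=len(formatted) + 1)
            )
    return formatted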