AdamCodd committed
Commit fab03cb
1 Parent(s): ff4075b

Update README.md

Files changed (1):
  1. README.md +2 -2
README.md CHANGED
@@ -32,7 +32,7 @@ tags:
 - art
 ---
 ## t5-small-negative-prompt-generator
-This model [t5-small](https://huggingface.co/google-t5/t5-small) has been finetuned on a subset of the [AdamCodd/Civitai-8m-prompts](https://huggingface.co/datasets/AdamCodd/Civitai-8m-prompts) dataset (~800K prompts) focused on the top 10% prompts according to Civitai's positive engagement ("stats" field in the dataset). The dataset includes negative embeddings and thus the model will output them.
+This model [t5-small](https://huggingface.co/google-t5/t5-small) has been finetuned on a subset of the [AdamCodd/Civitai-8m-prompts](https://huggingface.co/datasets/AdamCodd/Civitai-8m-prompts) dataset (~800K prompts) focused on the top 10% prompts according to Civitai's positive engagement ("stats" field in the dataset).
 
 It achieves the following results on the evaluation set:
 * Loss: 0.1730
@@ -47,7 +47,7 @@ The license is **cc-by-nc-4.0**. For commercial use rights, please [contact me](
 
 ## Usage
 
-The length of the negative prompt is adjustable with the `max_new_tokens` parameter. Keep in mind that you'll need to adjust the samplers slightly to avoid repetition and improve the quality of the output.
+The length of the negative prompt is adjustable with the `max_new_tokens` parameter. Keep in mind that you'll need to adjust the samplers slightly to avoid repetition and improve the quality of the output. The dataset includes negative embeddings, so you'll find them in the output.
 
 ```python
 from transformers import pipeline
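
For reference, a minimal sketch of how the generator described in the diffed Usage section might be called through the `pipeline` API. The model id, example prompt, and sampler values below are assumptions for illustration, not part of the commit:

```python
from transformers import pipeline

# Hypothetical model id, inferred from the README title; adjust to the actual repo id.
generator = pipeline(
    "text2text-generation",
    model="AdamCodd/t5-small-negative-prompt-generator",
)

# A sample positive prompt; the model generates a matching negative prompt.
prompt = "masterpiece, best quality, portrait of a woman, soft lighting, intricate details"

result = generator(
    prompt,
    max_new_tokens=60,       # controls the length of the generated negative prompt
    do_sample=True,          # assumed sampler tweaks to reduce repetition
    temperature=0.9,
    repetition_penalty=1.2,
)
print(result[0]["generated_text"])
```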