Commit e2dc066 by mrseeker87 (parent b5cd411): Create README.md
# GPT-Neo 2.7B - Shinen

## Model Description

GPT-Neo 2.7B-Shinen is a fine-tune created using EleutherAI's GPT-Neo 2.7B model. Compared to GPT-Neo-2.7B-Horni, this model is much heavier on the sexual content.

*Warning: THIS model is NOT suitable for use by minors. The model will output X-rated content.*
## Training data

The training data contains user-generated stories from sexstories.com. All stories are tagged in the following way:

```
[Theme: <theme1>, <theme2>, <theme3>]
<Story goes here>
```
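Because the tag line is plain text prepended to each story, prompts in the same shape can be assembled with a small helper. This is a minimal sketch; the function and argument names are illustrative, not part of the model card:

```python
def format_shinen_prompt(themes, story):
    """Prepend a [Theme: ...] tag line to a story, matching the
    tagging format described for the training data."""
    tag_line = "[Theme: " + ", ".join(themes) + "]"
    return tag_line + "\n" + story

# Build a prompt shaped like the training stories.
prompt = format_shinen_prompt(["romance", "drama"], "She was staring at me")
print(prompt)
# [Theme: romance, drama]
# She was staring at me
```

Prompting the model with a tag line in this shape should steer generation toward the listed themes, since that is the framing it saw during fine-tuning.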
### How to use

You can use this model directly with a pipeline for text generation. This example generates a different sequence each time it's run:

```py
>>> from transformers import pipeline
>>> generator = pipeline('text-generation', model='KoboldAI/GPT-Neo-2.7B-Shinen')
>>> generator("She was staring at me", do_sample=True, min_length=50)
[{'generated_text': 'She was staring at me with a look that said it all. She wanted me so badly tonight that I wanted'}]
```
### Limitations and Biases

GPT-Neo was trained as an autoregressive language model. This means that its core functionality is taking a string of text and predicting the next token. While language models are widely used for tasks other than this, there are a lot of unknowns with this work.

GPT-Neo-Shinen was trained on a dataset known to contain profanity, lewd, and otherwise abrasive language. GPT-Neo-Shinen *WILL* produce socially unacceptable text without warning.

Depending on the prompt, GPT-Neo-Shinen may produce offensive content without warning. We recommend having a human curate or filter the outputs before releasing them, both to censor undesirable content and to improve the quality of the results.
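One lightweight way to act on the filtering recommendation is a keyword-based post-filter that screens generated text before it is shown. The blocklist and function below are purely illustrative assumptions, not something shipped with the model; a real deployment would want human review or a dedicated moderation model:

```python
# Illustrative post-filter sketch: drop outputs containing blocklisted terms.
# The blocklist is a stand-in; curate a real one for your own use case.
BLOCKLIST = {"example_banned_term", "another_banned_term"}

def is_flagged(text):
    """Return True if any blocklisted term appears in the text."""
    lowered = text.lower()
    return any(term in lowered for term in BLOCKLIST)

outputs = [
    "A harmless generated sentence.",
    "This one contains example_banned_term and should be dropped.",
]
kept = [text for text in outputs if not is_flagged(text)]
print(kept)  # only the harmless sentence survives
```

Simple substring matching like this misses paraphrases and misspellings, which is why the paragraph above recommends human curation rather than automated filtering alone.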
### BibTeX entry and citation info

The model is made using the following software:

```bibtex
@software{gpt-neo,
  author       = {Black, Sid and
                  Leo, Gao and
                  Wang, Phil and
                  Leahy, Connor and
                  Biderman, Stella},
  title        = {{GPT-Neo: Large Scale Autoregressive Language
                   Modeling with Mesh-Tensorflow}},
  month        = mar,
  year         = 2021,
  note         = {{If you use this software, please cite it using
                   these metadata.}},
  publisher    = {Zenodo},
  version      = {1.0},
  doi          = {10.5281/zenodo.5297715},
  url          = {https://doi.org/10.5281/zenodo.5297715}
}
```