mrseeker87 committed
Commit 8cf4e3d
1 Parent(s): 8bb4c4a

Create README.md

Files changed (1)
  1. README.md +38 -0
README.md ADDED
 
# GPT-Neo 2.7B - Picard

## Model Description

GPT-Neo 2.7B-Picard is a finetune of EleutherAI's GPT-Neo 2.7B model.

## Training data

The training data contains around 1,800 ebooks, mostly in the sci-fi and fantasy genres.
### How to use

You can use this model directly with a pipeline for text generation. This example generates a different sequence each time it's run:

```py
>>> from transformers import pipeline
>>> generator = pipeline('text-generation', model='mrseeker87/GPT-Neo-2.7B-Picard')
>>> generator("Jean-Luc Picard", do_sample=True, min_length=50)
[{'generated_text': "Jean-Luc Picard, the captain of a Federation starship in command of one of Starfleet's few fulltime scientists."}]
```
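
If you would rather load the tokenizer and model directly instead of going through a pipeline, a minimal sketch using the standard transformers API could look like the following; the generation settings here mirror the pipeline example above, and the `max_length` and `pad_token_id` choices are our own assumptions rather than part of the original card:

```py
>>> from transformers import AutoTokenizer, AutoModelForCausalLM
>>> tokenizer = AutoTokenizer.from_pretrained('mrseeker87/GPT-Neo-2.7B-Picard')
>>> model = AutoModelForCausalLM.from_pretrained('mrseeker87/GPT-Neo-2.7B-Picard')
>>> inputs = tokenizer("Jean-Luc Picard", return_tensors='pt')
>>> outputs = model.generate(**inputs, do_sample=True, min_length=50, max_length=100,
...                          pad_token_id=tokenizer.eos_token_id)
>>> print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```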
### Limitations and Biases

GPT-Neo was trained as an autoregressive language model. This means that its core functionality is taking a string of text and predicting the next token. While language models are widely used for tasks other than this, there are a lot of unknowns with this work.

GPT-Neo was trained on the Pile, a dataset known to contain profanity, lewd, and otherwise abrasive language. Depending on your use case, GPT-Neo may produce socially unacceptable text. See Sections 5 and 6 of the Pile paper for a more detailed analysis of the biases in the Pile.

As with all language models, it is hard to predict in advance how GPT-Neo will respond to particular prompts, and offensive content may occur without warning. We recommend having a human curate or filter the outputs before releasing them, both to censor undesirable content and to improve the quality of the results.
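
As one illustration of the kind of automated pre-filtering that can complement human review, generations could be checked against a blocklist before release. This sketch is ours, not part of the original card, and the blocklist terms are placeholders:

```py
# Illustrative sketch only: drop generations that contain blocklisted terms
# before a human reviews what remains. The terms below are placeholders.
BLOCKLIST = {"blocked_term_1", "blocked_term_2"}

def passes_filter(text: str) -> bool:
    """Return True only if the text contains none of the blocklisted terms."""
    lowered = text.lower()
    return not any(term in lowered for term in BLOCKLIST)

generations = ["Jean-Luc Picard studied the anomaly on the viewscreen."]
released = [text for text in generations if passes_filter(text)]
```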
### BibTeX entry and citation info

The model was made using the following software:

```bibtex
@software{gpt-neo,
  author       = {Black, Sid and
                  Leo, Gao and
                  Wang, Phil and
                  Leahy, Connor and
                  Biderman, Stella},
  title        = {{GPT-Neo: Large Scale Autoregressive Language
                   Modeling with Mesh-Tensorflow}},
  month        = mar,
  year         = 2021,
  note         = {{If you use this software, please cite it using
                   these metadata.}},
  publisher    = {Zenodo},
  version      = {1.0},
  doi          = {10.5281/zenodo.5297715},
  url          = {https://doi.org/10.5281/zenodo.5297715}
}
```