Tincando committed
Commit 78b6f2f
1 Parent(s): 519d0b0

Update README.md

Files changed (1): README.md (+32, -30)

README.md CHANGED
@@ -9,39 +9,40 @@ model-index:
  <!-- This model card has been generated automatically according to the information the Trainer had access to. You
  should probably proofread and complete it, then remove this comment. -->
 
- # GPT-Neo for Fiction Story Generation
 
- This model is a fine-tuned version of EleutherAI's GPT-Neo-125M model, optimized for generating fiction stories.
- It has been trained on the dataset available at https://github.com/facebookresearch/fairseq/tree/main/examples/stories.
 
 
- ## Model description
 
- - Model Name: GPT-Neo-Fiction
- - Fine-Tuned By: Tin Kanjovsky/Tincando, Darko Etinger
- - Model Version: 1.0
 
- ## Intended uses & limitations
 
- The model is designed for creative fiction story generation. It can be used for various purposes, including but not limited to:
 
- - Storytelling: Generate engaging and imaginative fiction stories.
- - Content Generation: Create content for blogs, websites, or other media with a storytelling element.
- - Creative Writing: Assist authors and writers in brainstorming ideas and expanding narratives.
 
- ## Model Performance
 
- - Training Data: The model was trained on a diverse dataset of fiction stories.
- - Evaluation Metrics: Performance metrics, such as perplexity or BLEU scores, may vary depending on the specific task and dataset.
 
- ## Limitations
 
- - Content Quality: While GPT-Neo-Fiction can generate creative stories, the quality and coherence of the output may vary, and it may occasionally produce nonsensical or inappropriate content.
- - Bias: The model may exhibit biases present in the training data, and it is essential to be cautious when using it for sensitive topics or content.
- - Length of Output: The model may generate text with variable length, and it may not always produce the desired length of output.
- - Fine-Tuning Data: The quality of generated stories is dependent on the quality and diversity of the fine-tuning dataset.
-
- ## Usage
 
  ```
  from transformers import GPTNeoForCausalLM, GPT2Tokenizer
@@ -67,17 +68,18 @@ generated_story = tokenizer.batch_decode(output,clean_up_tokenization_spaces=True)
  print(generated_story)
  ```
 
- ## Ethical Considerations
 
- When using this model, consider the following ethical guidelines:
 
- - Content Moderation: Implement content moderation to ensure that the generated stories do not violate guidelines or community standards.
- - Bias and Fairness: Be aware of potential biases in the model's output and take steps to mitigate them.
- - Privacy: Avoid using personal or sensitive information as input prompts.
- - Legal Compliance: Ensure that the generated content complies with copyright and intellectual property laws.
 
- ## Citation
- If you use GPT-Neo-Fiction in your work, please consider citing the original GPT-Neo model and the dataset used for fine-tuning:
 
  - [GPT-Neo Paper](https://github.com/EleutherAI/gpt-neo)
  - [Fairseq Repository](https://github.com/facebookresearch/fairseq/tree/main/examples/stories)
 
  <!-- This model card has been generated automatically according to the information the Trainer had access to. You
  should probably proofread and complete it, then remove this comment. -->
 
+ # GPT-Neo za Generiranje Fiktivnih Priča / GPT-Neo for Fiction Story Generation
 
+ This model is a fine-tuned version of EleutherAI's GPT-Neo-125M model, optimized for generating fiction stories.
+ It was trained on the dataset available at https://github.com/facebookresearch/fairseq/tree/main/examples/stories.
 
 
+ ## Model Description
 
+ - Model Name: GPT-Neo-Fiction
+ - Student: Tin Kanjovsky/Tincando
+ - Mentor: izv.prof.dr.sc. Darko Etinger
+ - Model Version: 1.0
 
+ ## Intended Uses & Limitations
 
+ The model is designed for creative fiction story generation. It can be used for various purposes, including but not limited to:
 
+ - Storytelling: Generating engaging and imaginative fiction stories.
+ - Content Generation: Creating content for blogs, websites, or other media with a storytelling element.
+ - Creative Writing: Helping authors and writers brainstorm ideas and develop narratives.
 
+ ## Model Performance
 
+ - Training Data: The model was trained on a diverse dataset of fiction stories and prompts.
+ - Evaluation Metrics: Metrics such as perplexity or BLEU scores may vary depending on the specific task and dataset.
 
+ ## Limitations
 
+ - Content Quality: Although the model can generate creative stories, the quality and coherence of the output may vary, and it may occasionally produce nonsensical or inappropriate content.
+ - Bias: The model may exhibit biases present in the training data, so caution is needed when using it for sensitive topics or content.
+ - Output Length: The model may generate text of varying length and will not always produce the desired output length.
+ - Fine-Tuning Data: The quality of the generated stories depends on the quality and diversity of the fine-tuning dataset.
+
+ ## Usage
 
  ```
  from transformers import GPTNeoForCausalLM, GPT2Tokenizer

  print(generated_story)
  ```
 
+ ## Ethics
+
+ When using this model, consider the following ethical guidelines:
 
+ - Content Moderation: Implement content moderation to ensure that generated stories do not violate guidelines or community standards.
+ - Bias and Fairness: Be aware of potential biases in the model's output and take steps to mitigate them.
+ - Privacy: Avoid using personal or sensitive information as input prompts.
+ - Legal Compliance: Make sure the generated content complies with copyright and intellectual-property law.
 
+ ## Citation
 
+ If you use GPT-Neo-Fiction in your work, please consider citing the original GPT-Neo model and the dataset used for fine-tuning:
 
  - [GPT-Neo Paper](https://github.com/EleutherAI/gpt-neo)
  - [Fairseq Repository](https://github.com/facebookresearch/fairseq/tree/main/examples/stories)