---
license: mit
tags:
- generated_from_trainer
model_index:
- name: IFIS_ZORK_AI_SCIFI
  results:
  - task:
      name: Causal Language Modeling
      type: text-generation
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# IFIS_ZORK_AI_SCIFI

This model is a fine-tuned version of [gpt2](https://huggingface.co/gpt2) on an unknown dataset.

## Model description

More information needed

## Intended uses & limitations

More information needed
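
Since the card does not yet include a usage snippet, here is a minimal sketch of loading the checkpoint for generation with the `transformers` pipeline API. The `model_id` default is a placeholder assumption, not a verified repo id — substitute the actual published id or a local checkpoint path.

```python
from transformers import pipeline


def generate_scifi(prompt: str,
                   model_id: str = "IFIS_ZORK_AI_SCIFI",  # placeholder id (assumption)
                   max_new_tokens: int = 40) -> str:
    """Continue a Zork-style prompt with the fine-tuned GPT-2 checkpoint."""
    generator = pipeline("text-generation", model=model_id)
    out = generator(prompt, max_new_tokens=max_new_tokens, do_sample=True)
    return out[0]["generated_text"]
```

For example, `generate_scifi("You are standing in an open field west of a white house.")` would return the prompt plus a sampled continuation.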

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 200
- num_epochs: 3
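
The list above maps onto `transformers.TrainingArguments` roughly as follows. This is a sketch for reproducing the run, not the author's actual script: `output_dir` is a hypothetical placeholder, and the Adam settings are expressed through the Trainer's `adam_beta1`/`adam_beta2`/`adam_epsilon` arguments.

```python
# Keyword arguments mirroring the hyperparameters listed in this card,
# in transformers.TrainingArguments naming conventions.
training_kwargs = {
    "output_dir": "./ifis_zork_ai_scifi",   # hypothetical path (assumption)
    "learning_rate": 5e-05,
    "per_device_train_batch_size": 8,
    "per_device_eval_batch_size": 16,
    "seed": 42,
    "adam_beta1": 0.9,
    "adam_beta2": 0.999,
    "adam_epsilon": 1e-08,
    "lr_scheduler_type": "linear",
    "warmup_steps": 200,
    "num_train_epochs": 3,
}

# With transformers installed, the run would be configured via:
# args = TrainingArguments(**training_kwargs)
```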

### Training results



### Framework versions

- Transformers 4.8.2
- Pytorch 1.9.0+cu102
- Tokenizers 0.10.3