mgladden committed
Commit 0224bc5
1 Parent(s): c77c8cc

Upload model

Files changed (4)
  1. README.md +11 -9
  2. config.json +1 -1
  3. generation_config.json +1 -1
  4. tf_model.h5 +1 -1
README.md CHANGED
@@ -1,18 +1,20 @@
 ---
-license: apache-2.0
+license: mit
 tags:
-- management
-- text generation
+- generated_from_keras_callback
 model-index:
 - name: ManaGPT-1010
   results: []
-language:
-- en
 ---
 
+<!-- This model card has been generated automatically according to the information Keras had access to. You should
+probably proofread and complete it, then remove this comment. -->
+
 # ManaGPT-1010
-<img style="float:right; margin-right:30px" src="https://huggingface.co/NeuraXenetica/ManaGPT-1010/resolve/main/ManaGPT_logo_01.png" width="150" height="150">
-This model is a fine-tuned version of [distilgpt2](https://huggingface.co/distilgpt2) on a custom dataset.
+
+This model is a fine-tuned version of [gpt2](https://huggingface.co/gpt2) on an unknown dataset.
+It achieves the following results on the evaluation set:
+
 
 ## Model description
 
@@ -40,7 +42,7 @@ The following hyperparameters were used during training:
 
 ### Framework versions
 
-- Transformers 4.27.1
+- Transformers 4.27.2
 - TensorFlow 2.11.0
 - Datasets 2.10.1
-- Tokenizers 0.13.2
+- Tokenizers 0.13.2
config.json CHANGED
@@ -35,7 +35,7 @@
       "max_length": 50
     }
   },
-  "transformers_version": "4.27.1",
+  "transformers_version": "4.27.2",
   "use_cache": true,
   "vocab_size": 50257
 }
generation_config.json CHANGED
@@ -5,5 +5,5 @@
   "eos_token_id": 50256,
   "max_length": 50,
   "pad_token_id": 50256,
-  "transformers_version": "4.27.1"
+  "transformers_version": "4.27.2"
 }
tf_model.h5 CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:cb46ed853ac5704b211c087abcd299d904b9f69b2b959d84e6a00e17a546b931
+oid sha256:5b4c1d25cec73df540aefddb6441e4efb11b1adf598cea7cd0827936cf8d2894
 size 497935440
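
For context, a minimal sketch of loading the checkpoint uploaded in this commit and generating text with the committed generation settings. The repo id `NeuraXenetica/ManaGPT-1010` is inferred from the logo URL in the previous README, and the prompt is an arbitrary example; both are assumptions, not part of this commit.

```python
# Minimal usage sketch (assumption: the model is published as
# "NeuraXenetica/ManaGPT-1010"; the prompt below is an arbitrary example).
from transformers import AutoTokenizer, TFAutoModelForCausalLM

repo_id = "NeuraXenetica/ManaGPT-1010"

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = TFAutoModelForCausalLM.from_pretrained(repo_id)  # loads the tf_model.h5 weights

# generation_config.json in this commit sets max_length=50 and uses token id
# 50256 (the GPT-2 end-of-text token) as both eos_token_id and pad_token_id.
inputs = tokenizer("Organizational AI will", return_tensors="tf")
output_ids = model.generate(**inputs, max_length=50, pad_token_id=50256)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```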