boris committed
Commit 71ac475
1 parent: 0211bb2

New model from https://wandb.ai/wandb/huggingtweets/runs/3papp840
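The run linked above is the W&B record of this training job. As a rough sketch (assuming the public `wandb` client is installed and the run is publicly readable), its logged hyperparameters and metrics could be pulled programmatically:

```python
# Sketch: inspect the linked W&B run with the public API.
# Assumes `pip install wandb` and that wandb/huggingtweets/3papp840 is public.
import wandb

api = wandb.Api()
run = api.run("wandb/huggingtweets/3papp840")  # entity/project/run_id taken from the URL above

print(run.config)          # hyperparameters recorded for the run
print(dict(run.summary))   # final logged metrics
```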

README.md CHANGED
@@ -1,6 +1,6 @@
 ---
 language: en
-thumbnail: https://github.com/borisdayma/huggingtweets/blob/master/img/logo_share.png?raw=true
+thumbnail: https://github.com/borisdayma/huggingtweets/blob/master/img/logo.png?raw=true
 tags:
 - huggingtweets
 widget:
@@ -19,7 +19,7 @@ widget:
 <section class='prose'>

 <div>
-<div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('http://pbs.twimg.com/profile_images/1197606280082280448/d1HEh7Lh_400x400.jpg')">
+<div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1320840184494108674/d7A64nIG_400x400.jpg')">
 </div>
 <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Charles 🎉 Frye 🤖 AI Bot </div>
 <div style="font-size: 15px; color: #657786">@charles_irl bot</div>
@@ -35,7 +35,7 @@ The model uses the following pipeline.

 ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true)

-To understand how the model was developed, check the [W&B report](https://bit.ly/2TGXMZf).
+To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI).

 ## Training data

@@ -51,32 +51,32 @@ The model was trained on [@charles_irl's tweets](https://twitter.com/charles_irl
 <tbody style='border-width:0'>
 <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>
 <td style='border-width:0'>Tweets downloaded</td>
-<td style='border-width:0'>780</td>
+<td style='border-width:0'>1207</td>
 </tr>
 <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>
 <td style='border-width:0'>Retweets</td>
-<td style='border-width:0'>168</td>
+<td style='border-width:0'>249</td>
 </tr>
 <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>
 <td style='border-width:0'>Short tweets</td>
-<td style='border-width:0'>22</td>
+<td style='border-width:0'>47</td>
 </tr>
 <tr style='border-width:0'>
 <td style='border-width:0'>Tweets kept</td>
-<td style='border-width:0'>590</td>
+<td style='border-width:0'>911</td>
 </tr>
 </tbody>
 </table>

-[Explore the data](https://app.wandb.ai/wandb/huggingtweets/runs/3g45moaj/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
+[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/1osnjxmi/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.

 ## Training procedure

 The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @charles_irl's tweets.

-Hyperparameters and metrics are recorded in the [W&B training run](https://app.wandb.ai/wandb/huggingtweets/runs/2omraa20) for full transparency and reproducibility.
+Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/3papp840) for full transparency and reproducibility.

-At the end of training, [the final model](https://app.wandb.ai/wandb/huggingtweets/runs/2omraa20/artifacts) is logged and versioned.
+At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/3papp840/artifacts) is logged and versioned.

 ## Intended uses & limitations

@@ -102,7 +102,7 @@ In addition, the data present in the user's tweets further affects the text gene

 </section>

-[![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/borisdayma)
+[![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma)

 <section class='prose'>
 For more details, visit the project repository.
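The README above describes a GPT-2 model fine-tuned on @charles_irl's tweets. A minimal sketch of generating text with it via the `transformers` text-generation pipeline might look like the following; the model id `huggingtweets/charles_irl` is an assumption based on the project's naming convention.

```python
# Sketch: generate tweet-like text with the fine-tuned model.
# "huggingtweets/charles_irl" is an assumed model id; adjust it to the actual repository.
from transformers import pipeline

generator = pipeline("text-generation", model="huggingtweets/charles_irl")
outputs = generator("My dream is", max_length=50, num_return_sequences=3)

for out in outputs:
    print(out["generated_text"])
```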
config.json CHANGED
@@ -1,4 +1,5 @@
 {
+  "_name_or_path": "gpt2",
   "activation_function": "gelu_new",
   "architectures": [
     "GPT2LMHeadModel"
@@ -7,6 +8,7 @@
   "bos_token_id": 50256,
   "embd_pdrop": 0.1,
   "eos_token_id": 50256,
+  "gradient_checkpointing": false,
   "initializer_range": 0.02,
   "layer_norm_epsilon": 1e-05,
   "model_type": "gpt2",
@@ -32,5 +34,7 @@
       "top_p": 0.95
     }
   },
+  "transformers_version": "4.2.0",
+  "use_cache": true,
   "vocab_size": 50257
 }
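The new keys (`_name_or_path`, `gradient_checkpointing`, `transformers_version`, `use_cache`) are plain fields on the GPT-2 configuration object. A small sketch of inspecting them after loading the config; the model id is the same assumption as above, and any local path to this repository's `config.json` would work equally.

```python
# Sketch: load the updated config and read the newly added fields.
# "huggingtweets/charles_irl" is an assumed model id.
from transformers import AutoConfig

config = AutoConfig.from_pretrained("huggingtweets/charles_irl")
print(config.model_type)             # "gpt2"
print(config.use_cache)              # True -> past key/values are reused during generation
print(config.transformers_version)   # "4.2.0", the library version that saved the config
```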
merges.txt CHANGED
@@ -1,4 +1,4 @@
-#version: 0.2
+#version: 0.2 - Trained by `huggingface/tokenizers`
 Ġ t
 Ġ a
 h e
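`merges.txt` holds the byte-pair-encoding merge rules that, together with `vocab.json`, define the GPT-2 tokenizer. As a sketch, the two files can be loaded directly into the slow GPT-2 tokenizer; the local file paths are assumptions, pointing at the downloaded repository files.

```python
# Sketch: build a GPT-2 tokenizer straight from the vocab/merges files in this repo.
# The local paths "vocab.json" and "merges.txt" are assumptions.
from transformers import GPT2Tokenizer

tokenizer = GPT2Tokenizer(vocab_file="vocab.json", merges_file="merges.txt")
print(tokenizer.tokenize("New model from huggingtweets"))
print(tokenizer.encode("New model from huggingtweets"))
```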
pytorch_model.bin CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:781a51f9c7f110460d427fa53ebbd8b123cbd51c775f04dccd5ab18f02c9bf13
-size 510408315
+oid sha256:bd0563c013557ffea64aa7a506a7dce4583e8f7014ac75c9305dd6513f832cd3
+size 510406559
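The entry above is a Git LFS pointer file: `oid sha256:` is the SHA-256 digest of the actual weights blob and `size` is its byte count. A quick sketch for verifying a downloaded `pytorch_model.bin` against the pointer; the local path is an assumption.

```python
# Sketch: check a downloaded pytorch_model.bin against the Git LFS pointer above.
import hashlib

expected_oid = "bd0563c013557ffea64aa7a506a7dce4583e8f7014ac75c9305dd6513f832cd3"
expected_size = 510406559

h = hashlib.sha256()
size = 0
with open("pytorch_model.bin", "rb") as f:          # assumed local path
    for chunk in iter(lambda: f.read(1 << 20), b""):  # read in 1 MiB chunks
        h.update(chunk)
        size += len(chunk)

print(h.hexdigest() == expected_oid and size == expected_size)
```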
special_tokens_map.json CHANGED
@@ -1 +1 @@
-{"bos_token": {"content": "<|endoftext|>", "single_word": false, "lstrip": false, "rstrip": false, "normalized": true}, "eos_token": {"content": "<|endoftext|>", "single_word": false, "lstrip": false, "rstrip": false, "normalized": true}, "unk_token": {"content": "<|endoftext|>", "single_word": false, "lstrip": false, "rstrip": false, "normalized": true}}
+{"bos_token": "<|endoftext|>", "eos_token": "<|endoftext|>", "unk_token": "<|endoftext|>"}
tokenizer_config.json CHANGED
@@ -1 +1 @@
-{"model_max_length": 1024}
+{"unk_token": "<|endoftext|>", "bos_token": "<|endoftext|>", "eos_token": "<|endoftext|>", "add_prefix_space": false, "model_max_length": 1024, "name_or_path": "gpt2"}
training_args.bin ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:4a79c2df2cdf86448780f28021b3314537b2df10c5641745037c6b52b81f195e
+size 2031
vocab.json CHANGED
The diff for this file is too large to render. See raw diff