system (HF staff) committed on
Commit 0cef10e
1 Parent(s): 31a27ca

Update README.md

Files changed (1)
  1. README.md +79 -0
README.md ADDED
@@ -0,0 +1,79 @@
---
language: en
thumbnail: https://github.com/borisdayma/huggingtweets/blob/master/img/logo_share.png?raw=true
tags:
- exbert
- huggingtweets
widget:
- text: "My dream is"
---

<div>
  <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1278494692053618688/FEsN6_IF_400x400.jpg')">
  </div>
  <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Lavanya 🤖 AI Bot</div>
  <div style="font-size: 15px; color: #657786">@lavanyaai bot</div>
</div>

I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).

Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!

<a href="https://huggingface.co/exbert/?model=huggingtweets/lavanyaai&modelKind=autoregressive&sentence=I%20love%20huggingtweets!&layer=11">
  <img width="300px" src="https://hf-dinosaur.huggingface.co/exbert/button.png">
</a>

## How does it work?

The model uses the following pipeline.

![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true)

To understand how the model was developed, check the [W&B report](https://bit.ly/2TGXMZf).

## Training data

The model was trained on [@lavanyaai's tweets](https://twitter.com/lavanyaai).

| Data              | Quantity |
|-------------------|----------|
| Tweets downloaded | 3229     |
| Retweets          | 1570     |
| Short tweets      | 147      |
| Tweets kept       | 1512     |

[Explore the data](https://app.wandb.ai/borisd13/huggingtweets/runs/33pvebva/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
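Retweets and short tweets are dropped before training, which is why the counts add up: 3229 downloaded - 1570 retweets - 147 short tweets = 1512 tweets kept. Below is a minimal, illustrative sketch of that filtering step; the `filter_tweets` helper, the `min_words` cutoff, and the `RT @` heuristic are assumptions for illustration, not the exact rules huggingtweets applies.

```python
def filter_tweets(tweets, min_words=3):
    """Keep tweets that are neither retweets nor too short.

    `min_words` is an assumed cutoff for "short" tweets; the real
    threshold used by huggingtweets may differ.
    """
    kept = []
    for text in tweets:
        if text.startswith("RT @"):        # drop retweets
            continue
        if len(text.split()) < min_words:  # drop short tweets
            continue
        kept.append(text)
    return kept


# Toy example: one retweet and one short tweet are removed
sample = [
    "RT @someone: interesting thread",
    "ok",
    "Training a GPT-2 bot on my own tweets today",
]
print(filter_tweets(sample))  # ['Training a GPT-2 bot on my own tweets today']
```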

## Training procedure

The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which was fine-tuned on @lavanyaai's tweets for 4 epochs.

Hyperparameters and metrics are recorded in the [W&B training run](https://app.wandb.ai/borisd13/huggingtweets/runs/2paux5t9).
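
As a rough sketch (not the exact huggingtweets training script), fine-tuning GPT-2 on a plain-text file of tweets with the `transformers` Trainer could look like the following. The file path, block size, and batch size here are placeholders; the hyperparameters actually used are the ones recorded in the W&B run above.

```python
from transformers import (AutoModelForCausalLM, AutoTokenizer,
                          DataCollatorForLanguageModeling, TextDataset,
                          Trainer, TrainingArguments)

# Placeholder path: a plain-text file of the filtered tweets (assumption)
train_path = "lavanyaai_tweets.txt"

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")

# GPT-2 is a causal LM, so the collator uses no masked-LM objective
dataset = TextDataset(tokenizer=tokenizer, file_path=train_path, block_size=128)
collator = DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm=False)

args = TrainingArguments(
    output_dir="output",
    num_train_epochs=4,             # matches the 4 epochs mentioned above
    per_device_train_batch_size=8,  # placeholder; see the W&B run for the real value
)

Trainer(model=model, args=args, data_collator=collator,
        train_dataset=dataset).train()
```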

## Intended uses & limitations

#### How to use

You can use this model directly with a pipeline for text generation:

```python
from transformers import pipeline

# Download and load the fine-tuned model from the Hugging Face Hub
generator = pipeline('text-generation', model='huggingtweets/lavanyaai')

# Sample 5 continuations of the prompt, each at most 50 tokens (prompt included)
generator("My dream is", max_length=50, num_return_sequences=5)
```
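
The pipeline returns a list of dictionaries, one per generated sequence. If you want reproducible samples, you can seed the random number generators first; the seed value below is arbitrary.

```python
from transformers import pipeline, set_seed

set_seed(42)  # arbitrary seed, makes sampling reproducible across runs

generator = pipeline('text-generation', model='huggingtweets/lavanyaai')
outputs = generator("My dream is", max_length=50, num_return_sequences=5)

# Each item is a dict with a 'generated_text' key
for out in outputs:
    print(out['generated_text'])
```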

#### Limitations and bias

The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).

In addition, the content of the user's tweets directly shapes the text the model generates.

## About

*Built by Boris Dayma*

[![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/borisdayma)

For more details, visit the project repository.

[![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)