Update README.md
README.md CHANGED
@@ -2,7 +2,6 @@
 language: en
 thumbnail: https://github.com/borisdayma/huggingtweets/blob/master/img/logo_share.png?raw=true
 tags:
-- exbert
 - huggingtweets
 widget:
 - text: "My dream is"
@@ -19,10 +18,6 @@ I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).
 
 Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!
 
-<a href="https://huggingface.co/exbert/?model=huggingtweets/l2k&modelKind=autoregressive&sentence=I%20love%20huggingtweets!&layer=11">
-<img width="300px" src="https://hf-dinosaur.huggingface.co/exbert/button.png">
-</a>
-
 ## How does it work?
 
 The model uses the following pipeline.
@@ -37,18 +32,18 @@ The model was trained on [@l2k's tweets](https://twitter.com/l2k).
 
 | Data | Quantity |
 |-------------------|--------------|
-| Tweets downloaded |
-| Retweets |
-| Short tweets |
-| Tweets kept |
+| Tweets downloaded | 2541 |
+| Retweets | 578 |
+| Short tweets | 87 |
+| Tweets kept | 1876 |
 
-[Explore the data](https://app.wandb.ai/wandb/huggingtweets-dev/runs/
+[Explore the data](https://app.wandb.ai/wandb/huggingtweets-dev/runs/18jzfgqc/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
 
 ## Training procedure
 
 The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @l2k's tweets.
 
-Hyperparameters and metrics are recorded in the [W&B training run](
+Hyperparameters and metrics are recorded in the [W&B training run](https://app.wandb.ai/wandb/huggingtweets-dev/runs/2ly0pm0j) for full transparency and reproducibility.
 
 ## Intended uses & limitations
 
@@ -58,7 +53,8 @@ You can use this model directly with a pipeline for text generation:
 
 ```python
 from transformers import pipeline
-generator = pipeline('text-generation',
+generator = pipeline('text-generation',
+                     model='huggingtweets/l2k')
 generator("My dream is", max_length=50, num_return_sequences=5)
 ```
 
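A note on the quantities filled into the table: they are internally consistent, since 2541 downloaded - 578 retweets - 87 short tweets = 1876 tweets kept. Below is a minimal sketch of that filtering step, assuming tweets arrive as plain strings; the retweet test and the short-tweet cutoff are illustrative stand-ins for whatever rule the huggingtweets pipeline actually applies:

```python
# Sketch of the filtering summarized in the table above (illustrative only).
# Assumptions: tweets are plain strings, retweets start with "RT @", and
# "short" means fewer than min_words words -- not necessarily the exact rule.

def filter_tweets(tweets, min_words=3):
    kept, retweets, short = [], 0, 0
    for text in tweets:
        if text.startswith("RT @"):          # drop retweets
            retweets += 1
        elif len(text.split()) < min_words:  # drop very short tweets
            short += 1
        else:
            kept.append(text)
    return kept, retweets, short

# For @l2k's data this bookkeeping should reproduce the table:
# 2541 downloaded - 578 retweets - 87 short = 1876 kept.
```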
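On the training procedure: the card says a pre-trained GPT-2 is fine-tuned on the kept tweets, with hyperparameters recorded in the linked W&B run. A minimal sketch of such a run with transformers and datasets follows; the file name tweets.txt, the one-tweet-per-line format, and every hyperparameter below are assumptions, not the values huggingtweets used:

```python
# Minimal fine-tuning sketch: causal language modeling of GPT-2 on tweets.
# Assumptions: 'tweets.txt' holds one kept tweet per line; hyperparameters
# are placeholders -- the real ones live in the linked W&B training run.
from datasets import load_dataset
from transformers import (AutoModelForCausalLM, AutoTokenizer,
                          DataCollatorForLanguageModeling, Trainer,
                          TrainingArguments)

tokenizer = AutoTokenizer.from_pretrained("gpt2")
tokenizer.pad_token = tokenizer.eos_token  # GPT-2 ships without a pad token
model = AutoModelForCausalLM.from_pretrained("gpt2")

dataset = load_dataset("text", data_files={"train": "tweets.txt"})
tokenized = dataset["train"].map(
    lambda batch: tokenizer(batch["text"], truncation=True, max_length=128),
    batched=True, remove_columns=["text"])

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="out", num_train_epochs=4,
                           per_device_train_batch_size=8),
    train_dataset=tokenized,
    # mlm=False gives the plain next-token (causal LM) objective
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()
```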
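Finally, with the model argument restored, the usage snippet is complete; seeding first makes the five sampled continuations reproducible (set_seed is a standard transformers utility, not part of the original snippet):

```python
# Exercising the fixed usage snippet; the seed is only for reproducibility.
from transformers import pipeline, set_seed

set_seed(42)
generator = pipeline('text-generation', model='huggingtweets/l2k')
for sample in generator("My dream is", max_length=50, num_return_sequences=5):
    print(sample['generated_text'])
```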