system HF staff committed on
Commit
d9fb921
1 Parent(s): de16bbb

Update README.md

Files changed (1):
  1. README.md (+7 −11)
README.md CHANGED
@@ -2,7 +2,6 @@
 language: en
 thumbnail: https://github.com/borisdayma/huggingtweets/blob/master/img/logo_share.png?raw=true
 tags:
-- exbert
 - huggingtweets
 widget:
 - text: "My dream is"
@@ -19,10 +18,6 @@ I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).
 
 Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!
 
-<a href="https://huggingface.co/exbert/?model=huggingtweets/julien_c&modelKind=autoregressive&sentence=I%20love%20huggingtweets!&layer=11">
-<img width="300px" src="https://hf-dinosaur.huggingface.co/exbert/button.png">
-</a>
-
 ## How does it work?
 
 The model uses the following pipeline.
@@ -37,18 +32,18 @@ The model was trained on [@julien_c's tweets](https://twitter.com/julien_c).
 
 | Data              | Quantity |
 |-------------------|----------|
-| Tweets downloaded | 3219     |
-| Retweets          | 874      |
-| Short tweets      | 386      |
+| Tweets downloaded | 3222     |
+| Retweets          | 878      |
+| Short tweets      | 385      |
 | Tweets kept       | 1959     |
 
-[Explore the data](https://app.wandb.ai/wandb/huggingtweets-dev/runs/bf90ifto/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
+[Explore the data](https://app.wandb.ai/wandb/huggingtweets-dev/runs/2y6b11gn/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
 
 ## Training procedure
 
 The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @julien_c's tweets.
 
-Hyperparameters and metrics are recorded in the [W&B training run](https://app.wandb.ai/wandb/huggingtweets-dev/runs/nati5rk5) for full transparency and reproducibility.
+Hyperparameters and metrics are recorded in the [W&B training run](https://app.wandb.ai/wandb/huggingtweets-dev/runs/976325ig) for full transparency and reproducibility.
 
 ## Intended uses & limitations
 
@@ -58,7 +53,8 @@ You can use this model directly with a pipeline for text generation:
 
 ```python
 from transformers import pipeline
-generator = pipeline('text-generation', model='huggingtweets/julien_c')
+generator = pipeline('text-generation',
+                     model='huggingtweets/julien_c')
 generator("My dream is", max_length=50, num_return_sequences=5)
 ```
 
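
The updated data table can be sanity-checked: huggingtweets filters out retweets and short tweets before fine-tuning, so the kept count should equal the downloaded count minus both filtered categories. A minimal check of the new numbers in this commit (the filtering rule is how the huggingtweets pipeline is documented to work, not something stated in this diff itself):

```python
# Dataset counts from the updated README table in this commit.
downloaded = 3222
retweets = 878
short_tweets = 385
kept = 1959

# Retweets and short tweets are dropped before training, so the
# remainder should be exactly the "Tweets kept" row.
remaining = downloaded - retweets - short_tweets
assert remaining == kept
print(remaining)  # 1959
```

The same identity held for the old numbers (3219 − 874 − 386 = 1959), which is why the "Tweets kept" row is unchanged in this diff.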