shahp7575 committed
Commit fe59d59
1 Parent(s): 7766ad6

Update README.md

Files changed (1)
  1. README.md +22 -31
README.md CHANGED
@@ -1,4 +1,6 @@
  ---
+ language:
+ - es
  tags:
  - spanish
  - sentiment
@@ -14,42 +16,31 @@ should probably proofread and complete it, then remove this comment. -->
  This model fine-tunes [mrm8488/electricidad-base-discriminator](https://huggingface.co/mrm8488/electricidad-base-discriminator) on the [muchocine](https://huggingface.co/datasets/muchocine) dataset for sentiment classification to predict *star_rating*.


- ## Model description
-
- More information needed
-
- ## Intended uses & limitations
-
- More information needed
-
- ## Training and evaluation data
-
- More information needed
-
- ## Training procedure
-
- ### Training hyperparameters
-
- The following hyperparameters were used during training (mapped onto `TrainingArguments` in the sketch after this list):
- - learning_rate: 2e-05
- - train_batch_size: 2
- - eval_batch_size: 2
- - seed: 42
- - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- - lr_scheduler_type: linear
- - num_epochs: 2
-
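For reference, a minimal sketch of how the hyperparameters listed above would be passed to the `transformers` `Trainer`. The base checkpoint comes from the card's description, `num_labels=5` is assumed for the 1-5 *star_rating* target, and the dataset variables are placeholders; the author's actual training script is not part of this card.

```python
# Minimal sketch, not the author's script: the hyperparameters above expressed
# as transformers.TrainingArguments for the standard Trainer API.
from transformers import (
    AutoModelForSequenceClassification,
    Trainer,
    TrainingArguments,
)

# Base checkpoint named in the card; 5 labels assumed for the 1-5 star_rating target.
model = AutoModelForSequenceClassification.from_pretrained(
    "mrm8488/electricidad-base-discriminator", num_labels=5
)

training_args = TrainingArguments(
    output_dir="electricidad-muchocine",   # placeholder output directory
    learning_rate=2e-5,
    per_device_train_batch_size=2,
    per_device_eval_batch_size=2,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=2,
    # The optimizer listed above (Adam, betas=(0.9, 0.999), eps=1e-8) matches the
    # Trainer's default AdamW settings, so no explicit optimizer is configured here.
)

trainer = Trainer(
    model=model,
    args=training_args,
    train_dataset=train_dataset,   # placeholder: tokenized muchocine train split
    eval_dataset=eval_dataset,     # placeholder: tokenized muchocine validation split
)
trainer.train()
```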
- ### Training results
-
- | Training Loss | Epoch | Step | Validation Loss | Accuracy | F1     | Precision | Recall |
- |:-------------:|:-----:|:----:|:---------------:|:--------:|:------:|:---------:|:------:|
- | 1.3945        | 1.0   | 2582 | 1.1709          | 0.5      | 0.4852 | 0.5171    | 0.5    |
- | 0.9972        | 2.0   | 5164 | 1.2564          | 0.5161   | 0.5166 | 0.5331    | 0.5161 |
-
-
- ### Framework versions
-
- - Transformers 4.16.2
- - Pytorch 1.10.0+cu111
- - Datasets 1.18.3
- - Tokenizers 0.11.6
+ ### How to use
+ The model can be used directly with the Hugging Face `pipeline`.
+ ```python
+ from transformers import AutoTokenizer, AutoModelForSequenceClassification
+
+ # Load this fine-tuned checkpoint (replace the ID below with this model's Hub repository ID)
+ tokenizer = AutoTokenizer.from_pretrained("shahp7575/gpt2-horoscopes")
+ model = AutoModelForSequenceClassification.from_pretrained("shahp7575/gpt2-horoscopes")
+ ```
+
+ ### Examples
+
+ ```python
+ from transformers import pipeline
+ clf = pipeline('sentiment-analysis', model=model, tokenizer=tokenizer)
+
+ clf('¡Qué película tan fantástica! ¡Me alegro de haberlo visto!')  # "What a fantastic movie! I'm glad I saw it!"
+ >>> [{'label': '5', 'score': 0.9156607389450073}]
+
+ clf("La historia y el casting fueron geniales.")  # "The story and the cast were great."
+ >>> [{'label': '4', 'score': 0.6666394472122192}]
+
+ clf("Me gustó pero podría ser mejor.")  # "I liked it but it could be better."
+ >>> [{'label': '3', 'score': 0.7013391852378845}]
+
+ clf("dinero tirado en esta pelicula")  # "money wasted on this movie"
+ >>> [{'label': '2', 'score': 0.7564149498939514}]
+
+ ```
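Since the labels in the outputs above are the star ratings themselves, a small follow-up sketch (reusing the `clf` pipeline from the block above, and assuming the label format shown) that turns a prediction into an integer *star_rating*:

```python
# Sketch: map the pipeline's string label to an integer star rating (reuses `clf` from above).
def predict_star_rating(text: str) -> int:
    prediction = clf(text)[0]        # e.g. {'label': '5', 'score': 0.91...}
    return int(prediction['label'])

predict_star_rating("dinero tirado en esta pelicula")  # -> 2, per the example above
```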