aadishhug committed on
Commit 29b9fed
1 Parent(s): 4b5dec2

commit files to HF hub

Files changed (1)
  1. README.md +14 -13
README.md CHANGED
@@ -5,19 +5,19 @@ tags:
 metrics:
 - f1
 model-index:
-- name: distilbert-tweet-analysis
+- name: distilbert-tweets-analysis1
   results: []
 ---
 
 <!-- This model card has been generated automatically according to the information the Trainer had access to. You
 should probably proofread and complete it, then remove this comment. -->
 
-# distilbert-tweet-analysis
+# distilbert-tweets-analysis1
 
-This model is a fine-tuned version of [aadishhug/distilbert-tweet-analysis](https://huggingface.co/aadishhug/distilbert-tweet-analysis) on the None dataset.
+This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on the Coronavirus dataset.
 It achieves the following results on the evaluation set:
-- Loss: 0.4772
-- F1: 0.8600
+- Loss: 0.5688
+- F1: 0.8335
 
 ## Model description
 
@@ -36,26 +36,27 @@
 ### Training hyperparameters
 
 The following hyperparameters were used during training:
-- learning_rate: 2e-05
+- learning_rate: 1e-05
 - train_batch_size: 8
 - eval_batch_size: 8
 - seed: 42
 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
 - lr_scheduler_type: linear
-- num_epochs: 3
+- num_epochs: 4
 
 ### Training results
 
-| Training Loss | Epoch | Step | Validation Loss | F1 |
-|:-------------:|:-----:|:----:|:---------------:|:------:|
-| No log | 1.0 | 44 | 0.4833 | 0.8467 |
-| No log | 2.0 | 88 | 0.4609 | 0.8533 |
-| No log | 3.0 | 132 | 0.4772 | 0.8600 |
+| Training Loss | Epoch | Step | Validation Loss | F1 |
+|:-------------:|:-----:|:-----:|:---------------:|:------:|
+| 0.6876 | 1.0 | 3602 | 0.7097 | 0.7291 |
+| 0.5213 | 2.0 | 7204 | 0.5700 | 0.7911 |
+| 0.4177 | 3.0 | 10806 | 0.5234 | 0.8307 |
+| 0.3219 | 4.0 | 14408 | 0.5688 | 0.8335 |
 
 
 ### Framework versions
 
-- Transformers 4.27.2
+- Transformers 4.27.0
 - Pytorch 1.13.1+cu116
 - Datasets 2.10.1
 - Tokenizers 0.13.2
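
The hyperparameter list in the updated card maps directly onto `transformers.TrainingArguments`. Below is a minimal sketch of how the run could be set up; the label count and the tokenized Coronavirus-tweet splits are not given in the card, so they appear only as placeholders and comments.

```python
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    TrainingArguments,
)

# Base checkpoint named in the card; the number of sentiment labels is not
# stated there, so NUM_LABELS is an assumption to adjust for the real dataset.
NUM_LABELS = 5

tokenizer = AutoTokenizer.from_pretrained("distilbert-base-uncased")
model = AutoModelForSequenceClassification.from_pretrained(
    "distilbert-base-uncased", num_labels=NUM_LABELS
)

# Hyperparameters mirrored from the card; the Adam betas/epsilon and linear
# scheduler are the Trainer defaults, spelled out here for clarity.
training_args = TrainingArguments(
    output_dir="distilbert-tweets-analysis1",
    learning_rate=1e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    num_train_epochs=4,
    seed=42,
    lr_scheduler_type="linear",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    evaluation_strategy="epoch",
)

# A Trainer(model=model, args=training_args, train_dataset=..., eval_dataset=...)
# call with the tokenized tweet splits would then reproduce the training loop.
print(training_args)
```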
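
For inference, the fine-tuned checkpoint can be loaded with the standard `pipeline` API. The repo id below assumes the model is published as `aadishhug/distilbert-tweets-analysis1`; substitute the actual id if it differs.

```python
from transformers import pipeline

# Assumed repo id for the fine-tuned checkpoint described in this card.
classifier = pipeline(
    "text-classification",
    model="aadishhug/distilbert-tweets-analysis1",
)

print(classifier("Supermarkets are running out of hand sanitizer again."))
```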