lgk03 committed
Commit a7fec1e
1 Parent(s): 319e0be

End of training

Files changed (1):
  1. README.md +7 -6
README.md CHANGED
@@ -20,7 +20,7 @@ should probably proofread and complete it, then remove this comment. -->
 
  This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on an unknown dataset.
  It achieves the following results on the evaluation set:
- - Loss: 0.3354
+ - Loss: 0.3571
  - Accuracy: 0.8630
  - F1: 0.8657
  - Precision: 0.8915
@@ -51,18 +51,19 @@ The following hyperparameters were used during training:
  - total_train_batch_size: 128
  - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  - lr_scheduler_type: linear
- - num_epochs: 1
+ - num_epochs: 2
 
  ### Training results
 
  | Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 | Precision | Recall |
  |:-------------:|:-----:|:----:|:---------------:|:--------:|:------:|:---------:|:------:|
- | 0.187 | 1.0 | 672 | 0.3354 | 0.8630 | 0.8657 | 0.8915 | 0.8630 |
+ | 0.1867 | 1.0 | 672 | 0.3783 | 0.8630 | 0.8657 | 0.8915 | 0.8630 |
+ | 0.1441 | 2.0 | 1344 | 0.3571 | 0.8630 | 0.8657 | 0.8915 | 0.8630 |
 
 
  ### Framework versions
 
- - Transformers 4.40.2
- - Pytorch 2.2.1+cu121
- - Datasets 2.19.1
+ - Transformers 4.41.2
+ - Pytorch 2.3.0+cu121
+ - Datasets 2.20.0
  - Tokenizers 0.19.1
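
For readers who want to try the checkpoint documented in this card, a minimal inference sketch follows. The Hub repository id for the fine-tuned model is not shown on this commit page, so `lgk03/<model-repo>` is a hypothetical placeholder; only the base model (`distilbert-base-uncased`) and the sequence-classification setup are taken from the card itself.

```python
# Minimal inference sketch for the fine-tuned DistilBERT text classifier described above.
# NOTE: "lgk03/<model-repo>" is a hypothetical placeholder; the actual repository id
# is not shown on this commit page.
from transformers import pipeline

classifier = pipeline(
    "text-classification",
    model="lgk03/<model-repo>",  # replace with the real fine-tuned checkpoint id
)

# Returns a list of {"label": ..., "score": ...} predictions for the input text.
print(classifier("Example input sentence to classify."))
```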