josecaloca committed · verified
Commit a7fe7bb · 1 Parent(s): 2476496

josecaloca/test-multiclass-text-classification-model

Files changed (1):
1. README.md +0 -29
README.md CHANGED

@@ -4,11 +4,6 @@ license: apache-2.0
 base_model: distilbert/distilbert-base-uncased
 tags:
 - generated_from_trainer
-metrics:
-- accuracy
-- precision
-- recall
-- f1
 model-index:
 - name: multiclass-text-classification
   results: []
@@ -20,12 +15,6 @@ should probably proofread and complete it, then remove this comment. -->
 # multiclass-text-classification
 
 This model is a fine-tuned version of [distilbert/distilbert-base-uncased](https://huggingface.co/distilbert/distilbert-base-uncased) on an unknown dataset.
-It achieves the following results on the evaluation set:
-- Loss: 0.3907
-- Accuracy: 0.8697
-- Precision: 0.8597
-- Recall: 0.8582
-- F1: 0.8584
 
 ## Model description
 
@@ -54,24 +43,6 @@ The following hyperparameters were used during training:
 
 ### Training results
 
-| Training Loss | Epoch  | Step | Validation Loss | Accuracy | Precision | Recall | F1     |
-|:-------------:|:------:|:----:|:---------------:|:--------:|:---------:|:------:|:------:|
-| No log        | 0.0591 | 25   | 0.8338          | 0.6769   | 0.8050    | 0.5512 | 0.5219 |
-| No log        | 0.1182 | 50   | 0.6563          | 0.7864   | 0.8163    | 0.7376 | 0.7587 |
-| No log        | 0.1773 | 75   | 0.5698          | 0.8047   | 0.8144    | 0.7808 | 0.7895 |
-| No log        | 0.2364 | 100  | 0.5319          | 0.8169   | 0.7983    | 0.8122 | 0.8033 |
-| No log        | 0.2955 | 125  | 0.5371          | 0.8162   | 0.8330    | 0.7826 | 0.7981 |
-| No log        | 0.3546 | 150  | 0.4874          | 0.8381   | 0.8392    | 0.8158 | 0.8242 |
-| No log        | 0.4137 | 175  | 0.4418          | 0.8541   | 0.8456    | 0.8390 | 0.8421 |
-| No log        | 0.4728 | 200  | 0.4616          | 0.8512   | 0.8544    | 0.8246 | 0.8371 |
-| No log        | 0.5319 | 225  | 0.4434          | 0.8553   | 0.8489    | 0.8413 | 0.8446 |
-| No log        | 0.5910 | 250  | 0.4323          | 0.8561   | 0.8498    | 0.8434 | 0.8459 |
-| No log        | 0.6501 | 275  | 0.4430          | 0.8537   | 0.8606    | 0.8315 | 0.8421 |
-| No log        | 0.7092 | 300  | 0.4421          | 0.8536   | 0.8435    | 0.8460 | 0.8413 |
-| No log        | 0.7683 | 325  | 0.4220          | 0.8586   | 0.8487    | 0.8500 | 0.8488 |
-| No log        | 0.8274 | 350  | 0.3888          | 0.8693   | 0.8585    | 0.8570 | 0.8577 |
-| No log        | 0.8865 | 375  | 0.3930          | 0.8678   | 0.8571    | 0.8576 | 0.8566 |
-| No log        | 0.9456 | 400  | 0.3907          | 0.8697   | 0.8597    | 0.8582 | 0.8584 |
 
 
 ### Framework versions
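The metrics removed from the card metadata (accuracy, precision, recall, F1) are the kind typically produced by a `compute_metrics` callback passed to the `transformers` `Trainer`. A minimal sketch of such a callback, assuming scikit-learn and macro averaging (the card reports single precision/recall/F1 values for a multiclass task, but the actual averaging strategy is not stated in this diff):

```python
import numpy as np
from sklearn.metrics import accuracy_score, precision_recall_fscore_support

def compute_metrics(eval_pred):
    # eval_pred is the (logits, labels) pair that Trainer passes at eval time.
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)
    # "macro" averaging is an assumption: the removed card listed one scalar
    # per metric for a multiclass problem.
    precision, recall, f1, _ = precision_recall_fscore_support(
        labels, preds, average="macro", zero_division=0
    )
    return {
        "accuracy": accuracy_score(labels, preds),
        "precision": precision,
        "recall": recall,
        "f1": f1,
    }
```

A callback like this would be wired in via `Trainer(..., compute_metrics=compute_metrics)`, producing the per-step columns seen in the removed training-results table.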
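One detail worth noting in the removed training-results table: the lowest validation loss (0.3888 at step 350) occurred before the final logged step (0.3907 at step 400), which is why options like `Trainer`'s `load_best_model_at_end` exist. Reading the trajectory programmatically, with the values transcribed from the table:

```python
# (step, validation_loss) pairs transcribed from the removed
# training-results table in this diff.
history = [
    (25, 0.8338), (50, 0.6563), (75, 0.5698), (100, 0.5319),
    (125, 0.5371), (150, 0.4874), (175, 0.4418), (200, 0.4616),
    (225, 0.4434), (250, 0.4323), (275, 0.4430), (300, 0.4421),
    (325, 0.4220), (350, 0.3888), (375, 0.3930), (400, 0.3907),
]

# Pick the checkpoint with the lowest validation loss rather than the last one.
best_step, best_loss = min(history, key=lambda pair: pair[1])
print(best_step, best_loss)  # → 350 0.3888
```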
 