albertmartinez committed
Commit b9e9bec · verified · 1 Parent(s): 38f930b

Model save

Files changed (1)
  1. README.md +43 -29
README.md CHANGED
@@ -1,39 +1,53 @@
 ---
 license: mit
-metrics:
-- accuracy
-- precision
-- recall
-- f1
-datasets:
-- albertmartinez/OSDG
-pipeline_tag: text-classification
-widget:
-- text: "Between the Social and the Spatial - Exploring Multiple Dimensions of Poverty and Social Exclusion, Ashgate. Poverty in Europe and the USA, Exchanging Official Measurement Methods”, Maastricht Graduate School of Governance Working Paper 2007/005. Monitoring Absolute and Relative Poverty, ‘Not Enough’ Is Not the Same as ‘Much Less’”, Review of Income and Wealth, 57(2), 247-269. Poverty and Social Exclusion in Britain, The Policy Press, Bristol."
-- text: "A circular economy is a way of achieving sustainable consumption and production, as well as nature positive outcomes."
+base_model: albertmartinez/bert-sdg-classification
+tags:
+- generated_from_trainer
+model-index:
+- name: bert-sdg-classification
+  results: []
 ---

-# albertmartinez/bert-sdg-classification
+<!-- This model card has been generated automatically according to the information the Trainer had access to. You
+should probably proofread and complete it, then remove this comment. -->

-This model (BERT) is for classifying text with respect to the United Nations sustainable development goals (SDG).
+# bert-sdg-classification

-## Training Hyperparameters
+This model is a fine-tuned version of [albertmartinez/bert-sdg-classification](https://huggingface.co/albertmartinez/bert-sdg-classification) on an unknown dataset.

-- Num_epoch = 10
-- Learning rate = 5e-5
-- Batch size = 16
+## Model description
+
+More information needed
+
+## Intended uses & limitations
+
+More information needed
+
+## Training and evaluation data
+
+More information needed
+
+## Training procedure
+
+### Training hyperparameters
+
+The following hyperparameters were used during training:
+- learning_rate: 1e-05
+- train_batch_size: 32
+- eval_batch_size: 8
+- seed: 42
+- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
+- lr_scheduler_type: linear
+- lr_scheduler_warmup_steps: 600
+- num_epochs: 3.0

 ### Training results

-| epoch | eval_loss | eval_accuracy | eval_precision | eval_recall | eval_f1 |
-|:-----:|:------------------:|:------------------:|:------------------:|:------------------:|:------------------:|
-| 1 | 0.8289520740509033 | 0.7644437495113752 | 0.7640369944809821 | 0.7644437495113752 | 0.7554162181382816 |
-| 2 | 0.7316043972969055 | 0.792432178875772 | 0.7973419054011932 | 0.792432178875772 | 0.7936523802626467 |
-| 3 | 0.7474315762519836 | 0.7989992963802673 | 0.8003484834993271 | 0.7989992963802673 | 0.7980647892639322 |
-| 4 | 0.9092283248901367 | 0.8023610351028067 | 0.8028028170382215 | 0.8023610351028067 | 0.8010556735181147 |
-| 5 | 1.0973293781280518 | 0.8040028144789305 | 0.806116786873114 | 0.8040028144789305 | 0.8037135940426907 |
-| 6 | 1.2260032892227173 | 0.8032210147760144 | 0.8046046540363118 | 0.8032210147760144 | 0.8009496362737498 |
-| 7 | 1.3465653657913208 | 0.8082245328746775 | 0.8079189056438383 | 0.8082245328746775 | 0.8070935517356475 |
-| 8 | 1.458662509918213 | 0.8132280509733406 | 0.8124031757212116 | 0.8132280509733406 | 0.8124964838774498 |
-| 9 | 1.5251907110214233 | 0.8108044718943007 | 0.8112362484949358 | 0.8108044718943007 | 0.8097338645156864 |
-| 10 | 1.50314199924469 | 0.8152607302009225 | 0.8143774938584517 | 0.8152607302009225 | 0.8144630791491494 |
+
+
+### Framework versions
+
+- Transformers 4.42.4
+- Pytorch 2.3.1+cu121
+- Datasets 2.20.0
+- Tokenizers 0.19.1
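
The new hyperparameter list uses the naming of the `transformers` Trainer, which the `generated_from_trainer` tag also suggests. A minimal sketch of an equivalent `TrainingArguments` configuration, assuming single-device training (`output_dir` is a placeholder; the commit does not record it):

```python
from transformers import TrainingArguments

# Sketch of a TrainingArguments setup matching the hyperparameters
# listed in the updated card. Single-device training is assumed, so
# train_batch_size: 32 maps to per_device_train_batch_size=32.
training_args = TrainingArguments(
    output_dir="bert-sdg-classification",  # placeholder, not in the commit
    learning_rate=1e-05,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,     # Adam betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-08,
    lr_scheduler_type="linear",
    warmup_steps=600,   # lr_scheduler_warmup_steps
    num_train_epochs=3.0,
)
```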
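Since the removed card tagged the model `pipeline_tag: text-classification`, the checkpoint can presumably be loaded with the standard `transformers` pipeline. A minimal usage sketch, reusing one of the old card's widget examples (the exact label names and scores depend on the uploaded config, so the output shown in the comment is illustrative only):

```python
from transformers import pipeline

# Load the published checkpoint as a text-classification pipeline.
classifier = pipeline(
    "text-classification",
    model="albertmartinez/bert-sdg-classification",
)

# Widget example from the previous model card.
text = (
    "A circular economy is a way of achieving sustainable consumption "
    "and production, as well as nature positive outcomes."
)
print(classifier(text))  # e.g. [{'label': '<SDG label>', 'score': 0.97}]
```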