Dauka-transformers committed on
Commit d472065 · verified · 1 Parent(s): 440ea12

End of training

README.md ADDED
@@ -0,0 +1,73 @@
---
tags:
- generated_from_trainer
model-index:
- name: interpro_bert
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# interpro_bert

This model is a fine-tuned version of [](https://huggingface.co/) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.7610

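The card does not yet show how the checkpoint is meant to be loaded. A minimal loading sketch, assuming the hypothetical repo id `Dauka-transformers/interpro_bert` and a masked-language-modeling head (suggested by the BERT-style name, but not confirmed by this card):

```python
# Minimal loading sketch; the repo id and head type are assumptions,
# not confirmed by the model card.
from transformers import AutoModelForMaskedLM, AutoTokenizer

model_id = "Dauka-transformers/interpro_bert"  # hypothetical repo id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForMaskedLM.from_pretrained(model_id)

inputs = tokenizer("Example sequence with a [MASK] token.", return_tensors="pt")
logits = model(**inputs).logits  # (batch, sequence_length, vocab_size)
```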
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 256
- eval_batch_size: 128
- seed: 42
- distributed_type: multi-GPU
- num_devices: 8
- total_train_batch_size: 2048
- total_eval_batch_size: 1024
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 15

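A hedged sketch of how these values map onto `transformers.TrainingArguments`; the output path, the evaluation/save strategies, and the launch command are assumptions inferred from the per-epoch results reported below:

```python
# Sketch reconstructing the reported hyperparameters as TrainingArguments.
# Per-device sizes match the totals: 256 x 8 GPUs = 2048 train, 128 x 8 = 1024 eval.
from transformers import TrainingArguments

args = TrainingArguments(
    output_dir="interpro_bert",       # illustrative path
    learning_rate=2e-5,
    per_device_train_batch_size=256,
    per_device_eval_batch_size=128,
    seed=42,
    num_train_epochs=15,
    lr_scheduler_type="linear",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    evaluation_strategy="epoch",      # assumption: the card reports per-epoch eval loss
    save_strategy="epoch",            # assumption
)
# Multi-GPU launch across 8 devices, e.g.: torchrun --nproc_per_node=8 train.py
```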
### Training results

| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:------:|:---------------:|
| 4.7929 | 1.0 | 14395 | 4.4007 |
| 2.6085 | 2.0 | 28790 | 2.4082 |
| 1.7765 | 3.0 | 43185 | 1.6582 |
| 1.3909 | 4.0 | 57580 | 1.3030 |
| 1.1805 | 5.0 | 71975 | 1.1146 |
| 1.0593 | 6.0 | 86370 | 1.0107 |
| 0.9726 | 7.0 | 100765 | 0.9359 |
| 0.9167 | 8.0 | 115160 | 0.8880 |
| 0.8683 | 9.0 | 129555 | 0.8475 |
| 0.8436 | 10.0 | 143950 | 0.8224 |
| 0.8167 | 11.0 | 158345 | 0.7974 |
| 0.8009 | 12.0 | 172740 | 0.7850 |
| 0.7874 | 13.0 | 187135 | 0.7682 |
| 0.7772 | 14.0 | 201530 | 0.7654 |
| 0.7738 | 15.0 | 215925 | 0.7610 |

### Framework versions

- Transformers 4.39.2
- Pytorch 2.2.2+cu121
- Datasets 2.18.0
- Tokenizers 0.15.2
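A small reproducibility sketch that asserts the pinned stack at runtime before reusing the checkpoint:

```python
# Assert the framework versions reported above.
import datasets
import tokenizers
import torch
import transformers

assert transformers.__version__ == "4.39.2"
assert torch.__version__.startswith("2.2.2")
assert datasets.__version__ == "2.18.0"
assert tokenizers.__version__ == "0.15.2"
```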
emissions.csv ADDED
@@ -0,0 +1,2 @@
timestamp,project_name,run_id,duration,emissions,emissions_rate,cpu_power,gpu_power,ram_power,cpu_energy,gpu_energy,ram_energy,energy_consumed,country_name,country_iso_code,region,cloud_provider,cloud_region,os,python_version,codecarbon_version,cpu_count,cpu_model,gpu_count,gpu_model,longitude,latitude,ram_total_size,tracking_mode,on_cloud,pue
2024-04-16T16:04:28,codecarbon,95f18540-a607-4d10-861d-d3f723a7f6a0,94875.835136652,5.398440238205457,5.6900055008000204e-05,82.5,0.0,282.9608030319214,2.17365881507198,0.0,7.275142307569442,9.448801122641417,Saudi Arabia,SAU,al-qassim region,,,Linux-5.14.0-162.23.1.el9_1.x86_64-x86_64-with-glibc2.34,3.9.13,2.2.3,24,Intel(R) Xeon(R) Platinum 8260 CPU @ 2.40GHz,8,8 x Tesla V100-SXM2-32GB,42.3813,26.0116,754.562141418457,machine,N,1.0
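This row is CodeCarbon output (the `codecarbon_version` column reports 2.2.3): roughly 26.4 hours of tracked runtime (94,875 s), about 9.4 kWh consumed, and an estimated 5.4 kg CO2-eq on 8× Tesla V100-SXM2-32GB. Note that `gpu_power` and `gpu_energy` are logged as 0.0, so GPU draw apparently was not captured. A minimal sketch of how such a file is produced; the wrapper around the training entry point is an assumption, since the integration is not shown in this commit:

```python
# CodeCarbon sketch: track a workload and append one row to emissions.csv.
from codecarbon import EmissionsTracker

tracker = EmissionsTracker(output_file="emissions.csv")
tracker.start()
try:
    train()  # hypothetical training entry point
finally:
    emissions_kg = tracker.stop()  # estimated kg CO2-eq for the tracked span
    print(f"Estimated emissions: {emissions_kg:.3f} kg CO2-eq")
```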
generation_config.json ADDED
@@ -0,0 +1,7 @@
{
  "_from_model_config": true,
  "bos_token_id": 0,
  "eos_token_id": 2,
  "pad_token_id": 0,
  "transformers_version": "4.39.2"
}
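The committed special-token ids can be read back with `transformers.GenerationConfig`; a small sketch, assuming the file sits in a local checkpoint directory (illustrative path):

```python
# Load the committed generation config and inspect its token ids.
from transformers import GenerationConfig

gen_config = GenerationConfig.from_pretrained("interpro_bert")  # illustrative local dir
print(gen_config.bos_token_id, gen_config.eos_token_id, gen_config.pad_token_id)  # 0 2 0
```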
model.safetensors CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:988ee5af290e6c08ce6c2bb94bf86df0afe7c85c4adc05eca1ae26a195b1913c
+oid sha256:ca79060a71089c677c3f20567faadbcec11b003af744595bd4da9f7d9dee8e98
 size 1322166728
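Only this Git LFS pointer (hash and byte size) lives in the repo; the 1.3 GB weights file is stored out of band. A sketch for verifying a downloaded `model.safetensors` against the new pointer; the local path is illustrative:

```python
# Verify a downloaded weights file against the LFS pointer's sha256 and size.
import hashlib
from pathlib import Path

path = Path("model.safetensors")  # illustrative local path
expected_oid = "ca79060a71089c677c3f20567faadbcec11b003af744595bd4da9f7d9dee8e98"
expected_size = 1322166728

h = hashlib.sha256()
with path.open("rb") as f:
    for chunk in iter(lambda: f.read(1 << 20), b""):  # hash in 1 MiB chunks
        h.update(chunk)

assert path.stat().st_size == expected_size, "size mismatch"
assert h.hexdigest() == expected_oid, "sha256 mismatch"
print("pointer verified")
```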
runs/Apr15_13-24-52_gpu212-14/events.out.tfevents.1713177778.gpu212-14.780132.0 CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:582034ca3caa7958a6401c47a286adc2bfee745841d5524c9950077e91fb9d71
-size 100924
+oid sha256:51b81033cde628b3a5234093149b645f6e342016cc80901dbfd71ecae2aa644c
+size 101560