versae committed on
Commit 57383dc
Parent: fef47d8

Model save

README.md ADDED
@@ -0,0 +1,55 @@
+ ---
+ license: apache-2.0
+ library_name: peft
+ tags:
+ - generated_from_trainer
+ base_model: bertin-project/bertin-gpt-j-6B
+ model-index:
+ - name: bertin-gpt-j-6B_16bit_13
+   results: []
+ ---
+
+ <!-- This model card has been generated automatically according to the information the Trainer had access to. You
+ should probably proofread and complete it, then remove this comment. -->
+
+ # bertin-gpt-j-6B_16bit_13
+
+ This model is a fine-tuned version of [bertin-project/bertin-gpt-j-6B](https://huggingface.co/bertin-project/bertin-gpt-j-6B) on an unknown dataset.
+
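+ A minimal loading sketch; the adapter repo id `versae/bertin-gpt-j-6B_16bit_13` is an assumption (replace it with wherever this adapter is actually hosted), and versions follow the framework list below:
+
+ ```python
+ import torch
+ from peft import PeftModel
+ from transformers import AutoModelForCausalLM, AutoTokenizer
+
+ # Base model named in this card; fp16 matches the Native AMP training setup.
+ base = AutoModelForCausalLM.from_pretrained(
+     "bertin-project/bertin-gpt-j-6B", torch_dtype=torch.float16
+ )
+ tokenizer = AutoTokenizer.from_pretrained("bertin-project/bertin-gpt-j-6B")
+
+ # Hypothetical adapter repo id; swap in the real location of this adapter.
+ model = PeftModel.from_pretrained(base, "versae/bertin-gpt-j-6B_16bit_13")
+
+ # The base model is Spanish, so a Spanish prompt is used here.
+ inputs = tokenizer("El modelo responde:", return_tensors="pt")
+ outputs = model.generate(**inputs, max_new_tokens=20)
+ print(tokenizer.decode(outputs[0], skip_special_tokens=True))
+ ```
+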
+ ## Model description
+
+ More information needed
+
+ ## Intended uses & limitations
+
+ More information needed
+
+ ## Training and evaluation data
+
+ More information needed
+
+ ## Training procedure
+
+ ### Training hyperparameters
+
+ The following hyperparameters were used during training:
+ - learning_rate: 1.41e-05
+ - train_batch_size: 4
+ - eval_batch_size: 8
+ - seed: 42
+ - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
+ - lr_scheduler_type: linear
+ - num_epochs: 3
+ - mixed_precision_training: Native AMP
+
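+ A hedged reconstruction of these settings as `transformers.TrainingArguments` (the `output_dir` is a placeholder; all other values are copied from the list above):
+
+ ```python
+ from transformers import TrainingArguments
+
+ args = TrainingArguments(
+     output_dir="bertin-gpt-j-6B_16bit_13",  # placeholder
+     learning_rate=1.41e-5,
+     per_device_train_batch_size=4,
+     per_device_eval_batch_size=8,
+     seed=42,
+     adam_beta1=0.9,
+     adam_beta2=0.999,
+     adam_epsilon=1e-8,
+     lr_scheduler_type="linear",
+     num_train_epochs=3,
+     fp16=True,  # "Native AMP" mixed-precision training
+ )
+ ```
+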
+ ### Training results
+
+
+
+ ### Framework versions
+
+ - PEFT 0.7.1
+ - Transformers 4.37.2
+ - Pytorch 2.2.0+cu121
+ - Datasets 2.14.6
+ - Tokenizers 0.15.1
adapter_model.safetensors CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:b5934415b0c4b180e17f250643c9396f07befdce66be8fb72b92fc5df9b26de5
+ oid sha256:79a0e02f55b6c14d38df430e0b375529e6e25255cc274c6193a7a50ce1eb34ff
  size 29374712
emissions.csv ADDED
@@ -0,0 +1,2 @@
+ timestamp,project_name,run_id,duration,emissions,emissions_rate,cpu_power,gpu_power,ram_power,cpu_energy,gpu_energy,ram_energy,energy_consumed,country_name,country_iso_code,region,cloud_provider,cloud_region,os,python_version,codecarbon_version,cpu_count,cpu_model,gpu_count,gpu_model,longitude,latitude,ram_total_size,tracking_mode,on_cloud,pue
+ 2024-02-08T15:35:05,codecarbon,a11a3335-3fe3-4380-b239-2f4a6211b54a,14175.276561260223,0.08591174606498499,6.060675126422167e-06,42.5,412.338,377.07379245758057,0.16734638824065512,1.591781718034136,1.484170850461399,3.2432989567361914,Norway,NOR,,,,Linux-6.5.0-15-generic-x86_64-with-glibc2.35,3.10.9,2.2.3,96,Intel(R) Xeon(R) Gold 6248R CPU @ 3.00GHz,2,2 x NVIDIA RTX A6000,10.7559,59.9452,1005.5301132202148,machine,N,1.0
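The emissions.csv above is standard CodeCarbon output (codecarbon 2.2.3 per the CSV). A minimal sketch of how such a file is typically produced; the `train()` call is a hypothetical stand-in for the actual fine-tuning loop:

```python
from codecarbon import EmissionsTracker

# project_name matches the CSV's project_name column; output_file is where
# the row above would be appended.
tracker = EmissionsTracker(project_name="codecarbon", output_file="emissions.csv")
tracker.start()
try:
    train()  # hypothetical training entry point
finally:
    kg_co2 = tracker.stop()  # returns estimated emissions in kg CO2eq
    print(f"Estimated emissions: {kg_co2:.4f} kg CO2eq")
```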
runs/Feb08_11-38-20_dante/events.out.tfevents.1707388707.dante.4059757.0 CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:d4a664bd2f4e68af6234152f2de56b2cf1054a8283b72f107b1ce742ce01f7f2
- size 318372
+ oid sha256:89402dd01b96f5caa7dd1e8c40065c2a47d359ac9d0d4726bb98e7991a91b19a
+ size 342433