deuswoof committed on
Commit 854bf88
1 Parent(s): bef0f48

Training in progress, step 10

25_10_23_config_test_1.csv CHANGED
@@ -1,5 +1,5 @@
  run_number,comment,peformed_already,num_train_epochs,max_tokens,temperature,stop_token,classification_of_valuems,stemming,lemmatization
- 1,no variations,False,2,100,0.8,False,False,False,False
+ 1,no variations,True,2,100,0.8,False,False,False,False
  2,lemmatization set True,False,2,100,0.8,False,False,False,True
  3,stemming set True,False,2,100,0.8,False,False,True,False
  4,classification_of_valuems set True,False,2,100,0.8,False,True,False,False
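The change above flips `peformed_already` to True for run 1, which suggests a driver script that reads this CSV, runs the next pending configuration, and writes the flag back. A minimal sketch of that pattern, assuming pandas; the helper names and the driver-script idea are assumptions, not code from this repository:

```python
import pandas as pd

CONFIG_CSV = "25_10_23_config_test_1.csv"  # file changed in this commit

def next_pending_run(path=CONFIG_CSV):
    """Return the first row whose peformed_already flag is still False, or None."""
    df = pd.read_csv(path)  # pandas parses the True/False strings as booleans
    pending = df[~df["peformed_already"]]
    return pending.iloc[0] if not pending.empty else None

def mark_performed(run_number, path=CONFIG_CSV):
    """Flip peformed_already to True for one run, as this commit does for run 1."""
    df = pd.read_csv(path)
    df.loc[df["run_number"] == run_number, "peformed_already"] = True
    df.to_csv(path, index=False)
```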
25_10_23_results_real.csv CHANGED
@@ -1,4 +1,7 @@
  run_number,items_per_minute,changed_settings,total_time_taken,rouge_scores_unnest,rouge1 low Precision,rouge1 low Recall,rouge1 low F1 Score,rouge1 mid Precision,rouge1 mid Recall,rouge1 mid F1 Score,rouge1 high Precision,rouge1 high Recall,rouge1 high F1 Score,rouge2 low Precision,rouge2 low Recall,rouge2 low F1 Score,rouge2 mid Precision,rouge2 mid Recall,rouge2 mid F1 Score,rouge2 high Precision,rouge2 high Recall,rouge2 high F1 Score
- 1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0
- 2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0
- 3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0
+ 1,0.0,0,0.0,0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0
+ 2,0.0,0,0.0,0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0
+ 3,1524.3874190654303,no variations,17.318432092666626,0,0.1965167868126792,0.2623791264652471,0.2111415367108999,0.2352416782792418,0.3002366535049206,0.2448331845318486,0.2763985291685149,0.3390275015940367,0.2786734116446178,0.0636962956392297,0.0738250145824239,0.0644136194506123,0.0837962566203547,0.0936010812599078,0.0820598185466411,0.1074387971690385,0.1153256107423611,0.1020295456042898
+ 4,0.0,0,0.0,0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0
+ 5,0.0,0,0.0,0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0
+ 6,0.0,0,0.0,0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0
README.md CHANGED
@@ -580,6 +580,18 @@ The following `bitsandbytes` quantization config was used during training:
  - bnb_4bit_use_double_quant: True
  - bnb_4bit_compute_dtype: bfloat16
 
+ The following `bitsandbytes` quantization config was used during training:
+ - quant_method: bitsandbytes
+ - load_in_8bit: False
+ - load_in_4bit: True
+ - llm_int8_threshold: 6.0
+ - llm_int8_skip_modules: None
+ - llm_int8_enable_fp32_cpu_offload: False
+ - llm_int8_has_fp16_weight: False
+ - bnb_4bit_quant_type: nf4
+ - bnb_4bit_use_double_quant: True
+ - bnb_4bit_compute_dtype: bfloat16
+
  The following `bitsandbytes` quantization config was used during training:
  - quant_method: bitsandbytes
  - load_in_8bit: False
@@ -641,5 +653,6 @@ The following `bitsandbytes` quantization config was used during training:
  - PEFT 0.5.0
  - PEFT 0.5.0
  - PEFT 0.5.0
+ - PEFT 0.5.0
 
  - PEFT 0.5.0
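The block added to README.md describes a 4-bit NF4 quantization setup. For reference, a minimal sketch of the equivalent `transformers.BitsAndBytesConfig`, built only from the fields listed above; the base model identifier is a placeholder, since this excerpt does not name it:

```python
import torch
from transformers import AutoModelForCausalLM, BitsAndBytesConfig

# Mirrors the quantization fields listed in the README block above.
bnb_config = BitsAndBytesConfig(
    load_in_8bit=False,
    load_in_4bit=True,
    llm_int8_threshold=6.0,
    llm_int8_skip_modules=None,
    llm_int8_enable_fp32_cpu_offload=False,
    llm_int8_has_fp16_weight=False,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_use_double_quant=True,
    bnb_4bit_compute_dtype=torch.bfloat16,
)

# "base-model-id" is a placeholder; the README excerpt does not name the base model.
model = AutoModelForCausalLM.from_pretrained(
    "base-model-id",
    quantization_config=bnb_config,
    device_map="auto",
)
```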
adapter_model.bin CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:636f75fb8038f8363096d6919fdba754c7ac769da0feb207b620c449d1e1ec4d
+ oid sha256:34e657b44b96d14e2ce5b3e52196c8d5247583c616958a019d234899b9baa664
  size 100733709
adapter_model.safetensors CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:01c5648b80a562d64438773d6fbda607590b45c94580945fc905c362417f0526
+ oid sha256:5b3e7affe50204df0a1b08e6de3bc932e17c3ca1f57a3cfcfb84937c7c261ef4
  size 100690288
training_args.bin CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:7dbad175fd57d97a6e4595cdeff6bd06e3ba283e8fb58c53688360ffc44fddf1
+ oid sha256:2d9c878f6e25d425273ad506878908a2fc27ec29922d6df8dc0ca2ddc0ceaa53
  size 4283