jimboHsueh committed
Commit d85a7bc
Parent: 32f2cac

End of training

README.md CHANGED
@@ -350,4 +350,23 @@ The following `bitsandbytes` quantization config was used during training:
 ### Framework versions
 
 
+- PEFT 0.6.2
+## Training procedure
+
+
+The following `bitsandbytes` quantization config was used during training:
+- quant_method: bitsandbytes
+- load_in_8bit: False
+- load_in_4bit: True
+- llm_int8_threshold: 6.0
+- llm_int8_skip_modules: None
+- llm_int8_enable_fp32_cpu_offload: False
+- llm_int8_has_fp16_weight: False
+- bnb_4bit_quant_type: nf4
+- bnb_4bit_use_double_quant: True
+- bnb_4bit_compute_dtype: bfloat16
+
+### Framework versions
+
+
 - PEFT 0.6.2
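The README addition above records the `bitsandbytes` quantization settings as a flat list. As a minimal sketch, assuming the `transformers` `BitsAndBytesConfig` API and a placeholder base-model id (the commit does not name the base model), the same configuration expressed in code would be:

```python
import torch
from transformers import AutoModelForCausalLM, BitsAndBytesConfig

# 4-bit NF4 quantization with double quantization and bfloat16 compute,
# mirroring the settings listed in the README.
bnb_config = BitsAndBytesConfig(
    load_in_8bit=False,
    load_in_4bit=True,
    llm_int8_threshold=6.0,
    llm_int8_skip_modules=None,
    llm_int8_enable_fp32_cpu_offload=False,
    llm_int8_has_fp16_weight=False,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_use_double_quant=True,
    bnb_4bit_compute_dtype=torch.bfloat16,
)

# "base-model-id" is hypothetical; substitute the actual base checkpoint.
model = AutoModelForCausalLM.from_pretrained(
    "base-model-id",
    quantization_config=bnb_config,
    device_map="auto",
)
```

The `quant_method: bitsandbytes` entry is implied by using `BitsAndBytesConfig` itself rather than passed as a constructor argument.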
adapter_config.json CHANGED
@@ -16,8 +16,8 @@
   "rank_pattern": {},
   "revision": null,
   "target_modules": [
-    "v_proj",
-    "q_proj"
+    "q_proj",
+    "v_proj"
   ],
   "task_type": "CAUSAL_LM"
 }
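The only change to `adapter_config.json` is the order of `target_modules`; PEFT treats the list as a set of module-name patterns, so swapping `q_proj` and `v_proj` does not alter which layers carry LoRA adapters. A minimal sketch, assuming `peft`'s `LoraConfig` (the remaining hyperparameters are omitted because the diff does not show them):

```python
from peft import LoraConfig

# Attach LoRA to the attention query and value projections; list order
# is irrelevant, which is why this commit's reordering is cosmetic.
lora_config = LoraConfig(
    target_modules=["q_proj", "v_proj"],
    task_type="CAUSAL_LM",
)
```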
adapter_model.bin CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:5a0ee2ee379b85e0ed78c63e679f47657ed3e94bcfbbf56248630f73ffe7e07e
+oid sha256:eb05cf8e488844569978fc86490caa6eec712591941924aeae1f4f7295572eb9
 size 67155338
runs/Nov23_17-03-34_70e1776c12d2/events.out.tfevents.1700759014.70e1776c12d2.66554.0 ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:377ed893cb6838b7c39a0003c8dd08691296a850d7bf9bbd172d23d8acaf11bf
+size 35939
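`adapter_model.bin` and the new TensorBoard event file are stored as Git LFS pointers: each pointer holds only a spec version line, the object's `oid sha256:`, and its `size` in bytes, while the payload lives in LFS storage. A sketch for checking that a downloaded copy matches its pointer (the local path is hypothetical):

```python
import hashlib
import os

EXPECTED_OID = "eb05cf8e488844569978fc86490caa6eec712591941924aeae1f4f7295572eb9"
EXPECTED_SIZE = 67155338

def sha256_of(path: str) -> str:
    # Stream the file in 1 MiB chunks so large artifacts fit in memory.
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

path = "adapter_model.bin"  # the resolved download, not the pointer file
assert os.path.getsize(path) == EXPECTED_SIZE
assert sha256_of(path) == EXPECTED_OID
```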