Commit History
Fix and document test_datasets (#1228) · 5787e1a
Fix typo (#1231) [skip ci] · 8608d80 · xhedit
Peft lotfq (#1222) · 4cb7900 · winglian
FEAT: add tagging support to axolotl for DPOTrainer (#1209) · 18f8119
Revert "run PR e2e docker CI tests in Modal" (#1220) [skip ci] · 8da1633 · winglian
run PR e2e docker CI tests in Modal (#1217) [skip ci] · 36d053f · winglian
ADD: warning if hub_model_id is set but no save strategy (#1202) · af29d81
Respect sliding_window=None (#1214) · 62ca4a2 · DreamGenX
more checks and fixes for deepspeed and fsdp (#1208) [skip ci] · e923e62 · winglian
workaround for transformers bug requiring do_sample for saving pretrained (#1206) · ba944e6 · winglian
make sure to register the base chatml template even if no system message is provided (#1207) · badda37 · winglian
precompute dpo logprobs setting and fixes (#1199) [skip ci] · 33e1170 · winglian
Feat/chatml add system message (#1117) · 98b4762
fix(log): improve warning to clarify that lora_modules_to_save expects a list (#1197) · 08719b9 · Nanobit
Mixtral fixes 20240124 (#1192) [skip ci] · 54d2ac1 · winglian
Standardize system prompt format for AlpacaPrompter (#1190) [skip ci] · af02430 · Oleh Kuznetsov
more dpo fixes for dataset loading and docs (#1185) [skip ci] · 5bce45f · winglian
report min length of tokenized data (#1186) [skip ci] · d85d494 · winglian
Fix generation_config validation raising an Exception for do_merge_lora (#1184) · 02f2c72 · tisorlawan
Add support for offline mode with HF_HUB_OFFLINE envvar (#1182) · 71141de
DPO fixes v2 (#1174) · 59a31fe · winglian
Phi2 multipack (#1173) · 814aee6 · winglian
don't fail if weights can't be cast due to offload when merging (#1172) [skip ci] · fb7f9b9 · winglian
Add desc to map/filter (#1162) · 6840381
support for explicit test_dataset definition for evals (#786) · cda52dc · winglian
Falcon embeddings (#1149) [skip docker] · e799e08 · winglian
Vram fix attempt (#1164) [skip ci] · 32580c1 · winglian
improve vram use with gradient checkpointing (#1167) [skip ci] · 802f966 · winglian
Add mlflow callback for pushing config to mlflow artifacts (#1125) · b8e5603 · JohanWork
jupyter lab fixes (#1139) [skip ci] · eaaeefc · winglian
Qwen2 (#1166) · f5a828a · winglian
make sure the model config loader respects the model_revision too (#1160) [skip-ci] · fccb542 · winglian
Deprecate max packed sequence len (#1141) · 2ce5c0d · winglian
feat(dataset): add config to keep processed dataset in memory (#1152) · 3db5f2f · Nanobit
Multipack simplify for Mixtral (#1142) · 6910e6a · winglian
fix bf16 check when preprocessing data (#1140) · 317fa25 · winglian
fix(preprocess): make sure the dataset is not loaded from cache when using the preprocess CLI (#1136) · 1e56b88 · Nanobit
Preprocess dataset size fix (#1131) · 7570446 · winglian
Add `layers_to_transform` for `lora_config` (#1118) · 8487b97 · xzuyn
Enable or disable bf16 support based on availability (#1116) · 0865613 · Simon Hällqvist
Reverse caching PR (#1115) · 2202a20 · casperhansen
Disable caching on `--disable_caching` in CLI (#1110) · d66b101
keep gate in fp32 for 16 bit loras (#1105) · da97285 · winglian
feat: enable trl's autounwrap (#1060) · b432889 · Nanobit
add gptneox embeddings, fix phi2 inputs, also fix the casting (#1083) · 78c5b19 · winglian