add gptneox embeddings, fix phi2 inputs, also fix the casting (#1083) 78c5b19 winglian committed on Jan 11, 2024
Remove fused-dense-lib from requirements.txt (#1087) 91502b9 casperhansen committed on Jan 10, 2024
add python 3.11 to the matrix for unit tests (#1085) [skip ci] 6c19e93 winglian committed on Jan 10, 2024
optimize calculation of cu_seqlens from position_ids (#1084) [skip ci] 90036eb winglian committed on Jan 10, 2024
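For context on what #1084 is computing: with sample packing, several short sequences share one row, and varlen attention kernels need cumulative sequence lengths (`cu_seqlens`) so attention cannot cross packing boundaries. A minimal sketch of deriving them from `position_ids`, which reset to 0 at each packed boundary; the helper name is illustrative, not the project's actual function:

```python
import torch
import torch.nn.functional as F

def cu_seqlens_from_position_ids(position_ids: torch.Tensor) -> torch.Tensor:
    # Illustrative helper: in a packed batch each sub-sequence's
    # position_ids restart at 0, so every 0 marks a sequence start.
    pos = position_ids.flatten()
    starts = torch.nonzero(pos == 0).flatten()
    ends = torch.cat([starts[1:], starts.new_tensor([pos.numel()])])
    seqlens = ends - starts
    # varlen kernels expect int32 cumulative lengths with a leading 0
    return F.pad(seqlens.cumsum(0), (1, 0)).to(torch.int32)

# e.g. position_ids [0,1,2, 0,1, 0,1,2,3] -> cu_seqlens [0, 3, 5, 9]
```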
use tags again for test image, only run docker e2e after pre-commit checks (#1081) 9032e61 winglian committed on Jan 10, 2024
fix: warn user to install mamba_ssm package (#1019) d69ba2b Nanobit committed on Jan 10, 2024
additional logging to get maximum token length of a sequence in the dataset (#1066) [skip ci] 2f2582e winglian committed on Jan 10, 2024
update sharegpt conversations when chatml chat template is set (#1075) [skip ci] 0ce1a65 winglian committed on Jan 10, 2024
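For reference, ChatML delimits every turn with `<|im_start|>`/`<|im_end|>` markers, which is why sharegpt-style role names ("human"/"gpt") need remapping when that template is selected. A toy formatter, purely illustrative of the target format:

```python
# sharegpt roles must be remapped to ChatML's "user"/"assistant" first:
messages = [
    {"role": "user", "content": "Hello"},
    {"role": "assistant", "content": "Hi there!"},
]

def to_chatml(msgs):
    # toy formatter, not axolotl's implementation
    return "".join(
        f"<|im_start|>{m['role']}\n{m['content']}<|im_end|>\n" for m in msgs
    )

print(to_chatml(messages))
```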
fix: `train_on_inputs: true` ignored for sharegpt (#1045) [skip ci] 043c386 Nanobit winglian committed on Jan 10, 2024
be more robust about checking embedding modules for lora finetunes (#1074) [skip ci] 0f10080 winglian committed on Jan 10, 2024
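Background for #1074: when a LoRA finetune also trains embeddings (e.g. after adding new tokens), the embedding and output layers must be listed in PEFT's `modules_to_save` so full trainable copies are kept. A sketch using PEFT's public API; the module names assume a Llama-style model:

```python
from peft import LoraConfig

lora_config = LoraConfig(
    r=16,
    lora_alpha=32,
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj"],  # Llama-style names
    # keep full, trainable copies of the embedding layers alongside the
    # adapters; these are the modules the robustness check looks for
    modules_to_save=["embed_tokens", "lm_head"],
)
```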
swap the data collator for evals if not using sample packing (#1076) ead34c5 winglian committed on Jan 10, 2024
Update FUNDING.yml with bitcoin (#1079) [skip ci] 3b4c646 winglian committed on Jan 10, 2024
attempt to also run e2e tests that need GPUs (#1070) 788649f winglian committed on Jan 10, 2024
Separate AutoGPTQ dep to `pip install -e .[auto-gptq]` (#1077) 9be92d1 casperhansen committed on Jan 9, 2024
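The packaging mechanics behind #1077 follow standard setuptools extras; a hypothetical `setup.py` fragment, not the project's actual file:

```python
from setuptools import find_packages, setup

setup(
    name="axolotl",  # illustrative; not the project's real setup.py
    packages=find_packages(),
    install_requires=["transformers", "peft"],  # core deps stay mandatory
    # optional heavy backends leave the default install:
    extras_require={"auto-gptq": ["auto-gptq"]},
    # users opt in with: pip install -e '.[auto-gptq]'
)
```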
Add: mlflow for experiment tracking (#1059) [skip ci] 090c24d Johan Hansson winglian committed on Jan 9, 2024
fix double eos token for chatml (#1054) [skip ci] 651b7a3 winglian committed on Jan 9, 2024
Cosine learning rate schedule - minimum learning rate (#1062) 04b978b ricdomolm winglian committed on Jan 9, 2024
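A plain cosine schedule decays the learning rate to zero; #1062 adds a floor. A minimal sketch of the idea, with illustrative warmup handling rather than the PR's exact code:

```python
import math

def cosine_lr_with_min(step: int, total_steps: int, base_lr: float,
                       min_lr: float, warmup_steps: int = 0) -> float:
    # linear warmup to base_lr ...
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)
    # ... then cosine decay that bottoms out at min_lr instead of 0
    progress = (step - warmup_steps) / max(1, total_steps - warmup_steps)
    cosine = 0.5 * (1.0 + math.cos(math.pi * min(1.0, progress)))
    return min_lr + (base_lr - min_lr) * cosine
```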
Efficiently get the length of the tokenized docs (#1063) 81d3845 ricdomolm winglian committed on Jan 8, 2024
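One common way to get per-document token counts without re-tokenizing is a batched `datasets.map` over the already-tokenized rows; a sketch under that assumption, not necessarily the PR's exact approach:

```python
from datasets import Dataset

ds = Dataset.from_dict({"input_ids": [[1, 2, 3], [4, 5], [6, 7, 8, 9]]})

def add_lengths(batch):
    # rows are already tokenized, so length is just len(input_ids);
    # a batched map avoids per-example Python overhead
    return {"length": [len(ids) for ids in batch["input_ids"]]}

ds = ds.map(add_lengths, batched=True)
print(max(ds["length"]))  # -> 4
```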
Simplify Docker Unit Test CI (#1055) [skip ci] 9ca358b hamel winglian committed on Jan 6, 2024
streaming multipack for pretraining dataset (#959) 553c80f jinwonkim93 winglian committed on Jan 6, 2024
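The heart of multipack over a streaming source is greedy accumulation: pull tokenized examples from an iterator and emit a packed row whenever the next example would overflow. A minimal sketch, not the project's actual packer:

```python
from typing import Iterable, Iterator, List

def stream_multipack(token_seqs: Iterable[List[int]],
                     max_seq_len: int) -> Iterator[List[int]]:
    # Greedily pack tokenized examples from a (possibly infinite) stream.
    buffer: List[int] = []
    for seq in token_seqs:
        seq = seq[:max_seq_len]  # truncate oversized examples
        if len(buffer) + len(seq) > max_seq_len:
            yield buffer
            buffer = []
        buffer.extend(seq)
    if buffer:
        yield buffer
```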
feat: always push checkpoint to hub if set (#1049) [skip ci] cbdbf9e Nanobit committed on Jan 5, 2024
feature: better device mapping for large models (#918) bdfefaf kallewoof winglian committed on Jan 5, 2024
Added chatglm3 conversation type for training models like TinyLlama (#1036) 59b2d30 xaviviro committed on Jan 4, 2024
bump transformers and update attention class map name (#1023) bcc78d8 winglian committed on Jan 3, 2024
[Docs] delete unused cfg value `lora_out_dir` (#1029) a3e8783 hamel Nanobit committed on Jan 3, 2024
chore(readme): update instruction to set config to load from cache (#1030) b31038a Nanobit committed on Jan 3, 2024
added tiny llama examples for lora and qlora (#1027) c75f916 Tim Dolan committed on Jan 3, 2024
use recommended setting for use_reentrant with gradient checkpointing (#1021) 4d2e842 winglian committed on Jan 2, 2024
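PyTorch recommends the non-reentrant checkpointing implementation, and recent transformers releases (4.35+) let you forward that flag through `TrainingArguments`; a sketch of the setting in question:

```python
from transformers import TrainingArguments

args = TrainingArguments(
    output_dir="out",
    gradient_checkpointing=True,
    # non-reentrant checkpointing is PyTorch's recommended setting
    gradient_checkpointing_kwargs={"use_reentrant": False},
)
```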
Fix: bf16 support for inference (#981) 3678a6c Tazik Shahjahan winglian committed on Dec 29, 2023
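For illustration, this is what bf16 inference generally looks like with transformers; the model id is a placeholder, and the PR's actual fix concerns axolotl's own inference path:

```python
import torch
from transformers import AutoModelForCausalLM

model = AutoModelForCausalLM.from_pretrained(
    "TinyLlama/TinyLlama-1.1B-Chat-v1.0",  # placeholder model id
    torch_dtype=torch.bfloat16,  # load weights directly in bf16
)
model.eval()  # inference mode
```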
[WandB] Push axolotl config to top-level wandb files (#1014) 4f4d638 hamel committed on Dec 29, 2023
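For reference, `wandb.save` is W&B's public API for placing a file at the top level of a run's Files tab; a sketch with an illustrative project name and path:

```python
import wandb

run = wandb.init(project="axolotl")  # project name illustrative
# a bare filename lands in the run's files root, so it appears at the
# top level of the W&B Files tab rather than nested in a subdirectory
wandb.save("config.yml", policy="now")
```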