fix for local variable 'LlamaForCausalLM' referenced before assignment 14163c1 winglian committed on Jun 10, 2023
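A minimal illustration of the error class this commit fixes (not the actual axolotl code; the load_in_8bit flag is a placeholder): in Python, a name bound only inside one branch of a function is treated as local for the whole function, so other code paths raise UnboundLocalError ("local variable ... referenced before assignment").

def load_model_class(load_in_8bit: bool):
    if load_in_8bit:
        from transformers import LlamaForCausalLM  # name bound only on this branch
    return LlamaForCausalLM  # UnboundLocalError when load_in_8bit is False

def load_model_class_fixed(load_in_8bit: bool):
    # Fix: bind the name on every code path before it is used.
    from transformers import LlamaForCausalLM
    return LlamaForCausalLM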
new prompters, misc fixes for missing output dir when using FSDP, and changing max seq len 4ac9e25 winglian committed on Jun 6, 2023
Update doc for grad_accu and add validation tests for batch size 3c71c8d Nanobit committed on May 31, 2023
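A sketch of the kind of batch-size validation this refers to, assuming the usual relationship batch_size = micro_batch_size * gradient_accumulation_steps (the config keys and exact formula here are assumptions, not the committed test):

def validate_batch_size(cfg: dict) -> None:
    micro = cfg.get("micro_batch_size")
    grad_accu = cfg.get("gradient_accumulation_steps")
    batch = cfg.get("batch_size")
    if micro and grad_accu and batch and batch != micro * grad_accu:
        raise ValueError(
            f"batch_size ({batch}) must equal micro_batch_size ({micro}) "
            f"* gradient_accumulation_steps ({grad_accu})"
        )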
fix packing so that concatenated sequences reset the attention 9b8585d winglian committed on May 31, 2023
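The idea behind this fix, sketched as an illustration rather than the committed patch: when samples are packed into one row, use a block-diagonal causal mask so tokens in one concatenated sequence cannot attend to tokens from a neighboring sequence.

import torch

def block_diagonal_causal_mask(seq_lens: list[int]) -> torch.Tensor:
    # True where attention is allowed: causal within each packed sequence,
    # fully blocked across sequence boundaries.
    total = sum(seq_lens)
    mask = torch.zeros(total, total, dtype=torch.bool)
    start = 0
    for length in seq_lens:
        end = start + length
        mask[start:end, start:end] = torch.tril(
            torch.ones(length, length, dtype=torch.bool)
        )
        start = end
    return mask

# e.g. two sequences of lengths 3 and 2 packed into one row
mask = block_diagonal_causal_mask([3, 2])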
Merge pull request #124 from OpenAccess-AI-Collective/xformers-fix 2d0ba3b winglian committed on May 31, 2023
Merge pull request #120 from OpenAccess-AI-Collective/model-from-path c7021e1 winglian committed on May 31, 2023
Update src/axolotl/monkeypatch/llama_attn_hijack_xformers.py 1076bcb winglian Nanobit committed on May 31, 2023
Update src/axolotl/monkeypatch/llama_attn_hijack_xformers.py 2daa683 winglian Nanobit committed on May 31, 2023
copy xformers attn from ooba since we removed dep on alpaca_lora_4bit 6cb2310 winglian committed on May 31, 2023
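For context, a hedged sketch of the xformers call such an attention monkeypatch typically routes LLaMA attention through; this uses the standard xformers.ops API and is not the copied ooba code, and the tensor sizes are arbitrary examples.

import torch
import xformers.ops as xops

# q/k/v shaped [batch, seq_len, num_heads, head_dim]
q = torch.randn(1, 16, 32, 128, device="cuda", dtype=torch.float16)
k = torch.randn(1, 16, 32, 128, device="cuda", dtype=torch.float16)
v = torch.randn(1, 16, 32, 128, device="cuda", dtype=torch.float16)

out = xops.memory_efficient_attention(
    q, k, v, attn_bias=xops.LowerTriangularMask()  # causal attention
)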
split up llama model loading so config can be loaded from base config and models can be loaded from a path 2520ecd winglian committed on May 31, 2023
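A minimal sketch of the split this describes, using the standard transformers API; the base config name and local checkpoint path below are hypothetical placeholders, not values from the commit.

from transformers import AutoConfig, LlamaForCausalLM

base_model_config = "huggyllama/llama-7b"  # hypothetical: where the config is read from
base_model = "/path/to/local/checkpoint"   # hypothetical: where the weights are loaded from

config = AutoConfig.from_pretrained(base_model_config)
model = LlamaForCausalLM.from_pretrained(base_model, config=config)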
Update src/axolotl/prompt_strategies/alpaca_instruct.py c17dae6 Nanobit winglian committed on May 29, 2023