Attention mask and position id fixes for packing (#285) 2bb0b78 winglian committed on Aug 12, 2023
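For context on what a packing fix of this kind typically involves: when several samples are packed into one row, position ids must restart at each sample boundary and the attention mask must be block-diagonal so tokens cannot attend across samples. A minimal sketch of that idea follows; the helper names and the exact mechanics of #285 are assumptions, not the commit's actual code.

```python
import torch

def packed_position_ids(seq_lens, device="cpu"):
    # Restart position ids at 0 for every packed sub-sequence so rotary
    # position embeddings are computed per example, not across the pack.
    # (Hypothetical helper, not the function from #285.)
    return torch.cat([torch.arange(n, device=device) for n in seq_lens])

def packed_attention_mask(seq_lens, device="cpu"):
    # Block-diagonal mask: tokens attend only within their own
    # sub-sequence, never across packing boundaries.
    total = sum(seq_lens)
    mask = torch.zeros(total, total, dtype=torch.bool, device=device)
    offset = 0
    for n in seq_lens:
        mask[offset:offset + n, offset:offset + n] = True
        offset += n
    return mask

# Three samples of lengths 3, 2, and 4 packed into one row:
print(packed_position_ids([3, 2, 4]))  # tensor([0, 1, 2, 0, 1, 0, 1, 2, 3])
```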
Update XFormers Attention Monkeypatch to handle Llama-2 70B (GQA) (#339) 10405b9 ssmi153 committed on Aug 6, 2023
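Llama-2 70B uses grouped-query attention (64 query heads sharing 8 key/value heads), so a monkeypatched attention path has to widen the key/value tensors before the kernel sees them. The usual workaround is to repeat each KV head to match the query head count; the sketch below shows that pattern under the assumption that #339 does something similar, rather than reproducing the commit itself.

```python
import torch

def repeat_kv(x: torch.Tensor, n_rep: int) -> torch.Tensor:
    # Expand (batch, seq, n_kv_heads, head_dim) to
    # (batch, seq, n_kv_heads * n_rep, head_dim) so grouped key/value
    # heads line up one-to-one with the query heads.
    if n_rep == 1:
        return x
    bsz, slen, n_kv, head_dim = x.shape
    return (
        x[:, :, :, None, :]
        .expand(bsz, slen, n_kv, n_rep, head_dim)
        .reshape(bsz, slen, n_kv * n_rep, head_dim)
    )

# Llama-2 70B: 64 query heads, 8 KV heads -> repeat factor 8.
k = torch.randn(1, 16, 8, 128)
print(repeat_kv(k, 64 // 8).shape)  # torch.Size([1, 16, 64, 128])
```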
fix sdp attention to use the flash/mem-efficient context manager a032c9f winglian committed on Jul 20, 2023
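The context manager referred to here is PyTorch 2.0's `torch.backends.cuda.sdp_kernel`, which restricts `scaled_dot_product_attention` to specific backends. A minimal sketch, assuming a CUDA device with fp16 tensors (the shapes and tensors are illustrative, not taken from the commit):

```python
import torch
import torch.nn.functional as F

# (batch, heads, seq_len, head_dim) fp16 tensors on GPU.
q = torch.randn(1, 32, 128, 64, device="cuda", dtype=torch.float16)
k = torch.randn(1, 32, 128, 64, device="cuda", dtype=torch.float16)
v = torch.randn(1, 32, 128, 64, device="cuda", dtype=torch.float16)

# Allow only the flash and memory-efficient kernels; disable the
# slower math fallback so SDPA cannot silently use it.
with torch.backends.cuda.sdp_kernel(
    enable_flash=True, enable_math=False, enable_mem_efficient=True
):
    out = F.scaled_dot_product_attention(q, k, v, is_causal=True)
```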
Fixed pre-commit problems, fixed a small bug in logging_config to handle the LOG_LEVEL env var b1f4f7a theobjectivedad committed on Jul 15, 2023
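Reading a log level from the environment usually looks like the sketch below; this is a generic illustration of the pattern, not the actual `logging_config` module from the repository.

```python
import logging
import os

# Honor an optional LOG_LEVEL environment variable, defaulting to INFO.
log_level = os.getenv("LOG_LEVEL", "INFO").upper()
logging.basicConfig(level=getattr(logging, log_level, logging.INFO))
logging.getLogger(__name__).info("logging configured at %s", log_level)
```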
Update src/axolotl/monkeypatch/llama_attn_hijack_xformers.py 1076bcb winglian Nanobit committed on May 31, 2023
Update src/axolotl/monkeypatch/llama_attn_hijack_xformers.py 2daa683 winglian Nanobit committed on May 31, 2023
copy xformers attn from ooba since we removed dep on alpaca_lora_4bit 6cb2310 winglian committed on May 31, 2023