Attention mask and position id fixes for packing (#285) 2bb0b78 winglian committed on Aug 12, 2023
Fixed pre-commit problems, fixed small bug in logging_config to handle LOG_LEVEL env var b1f4f7a theobjectivedad committed on Jul 15, 2023
Merge pull request #92 from OpenAccess-AI-Collective/flash-optimum 16bb627 winglian committed on Jun 14, 2023
Merge pull request #177 from NanoCode012/fix/landmark-patch 8002ffb winglian committed on Jun 12, 2023
add flash attn context for efficient training and attempt setting model to train mode 8792199 winglian committed on May 27, 2023
Remove explicit definition of cfg.inference c250898 Angainor Development committed on Jun 10, 2023
new prompters, misc fixes for output dir missing using fsdp, and changing max seq len 4ac9e25 winglian committed on Jun 6, 2023
Merge pull request #119 from NanoCode012/feat/update-inference fac4600 Nanobit committed on May 31, 2023
Merge pull request #120 from OpenAccess-AI-Collective/model-from-path c7021e1 winglian committed on May 31, 2023
Merge pull request #108 from OpenAccess-AI-Collective/docker-gptq bbc5bc5 winglian committed on May 30, 2023