
Commit History

fix for local variable 'LlamaForCausalLM' referenced before assignment
14163c1

winglian committed on

Merge branch 'main' into patch-1
79e2a6f
unverified

Angainor Development committed on

add support to extend context with xpos rope
a03a7d7

winglian committed on

fix for max sequence len across different model types
7f09106

winglian committed on

Fix backward compat for peft
aefb2fc

Nanobit committed on

WIP: Rely on cfg.inference
813cfa4
unverified

Angainor Development committed on

Fix grad checkpoint and outputs param
2a801b0

Nanobit committed on

Fix patching via import instead of hijacking
e44c9e0

Nanobit committed on

Feat: Add landmark attention
55b8542

Nanobit committed on

Disable Wandb
f4df266

Bruno Cabral committed on

Refactor out unmodified save_steps and eval_steps
2ef4634

Nanobit committed on

Set to use cfg.seed or 42 for backward compat
2cfe9e9

Nanobit committed on

Fix failing test
bfd27ba

Nanobit committed on

Validate falcon with fsdp
babf0fd

Nanobit committed on

Fix future deprecate prepare_model_for_int8_training
df9528f

Nanobit committed on

Fix training over existing lora
193c73b
unverified

Angainor Development committed on

fix camel ai, add guanaco/oasst mapping for sharegpt
59bb219

winglian committed on

new prompters, misc fixes for output dir missing using fsdp, and changing max seq len
4ac9e25

winglian committed on

Update doc for grad_accu and add validation tests for batch size
3c71c8d

Nanobit committed on

fix batch size calculation
5a631b3

winglian committed on

fix packing so that concatenated sequences reset the attention
9b8585d

winglian committed on

Merge pull request #124 from OpenAccess-AI-Collective/xformers-fix
2d0ba3b
unverified

winglian committed on

Merge pull request #120 from OpenAccess-AI-Collective/model-from-path
c7021e1
unverified

winglian committed on

don't worry about dupes
c56818b

winglian committed on

Update src/axolotl/monkeypatch/llama_attn_hijack_xformers.py
1076bcb
unverified

winglian and Nanobit committed on

Update src/axolotl/monkeypatch/llama_attn_hijack_xformers.py
2daa683
unverified

winglian and Nanobit committed on

remove unused import and update readme
e3c494c

winglian committed on

black formatting
ad0ea6a

winglian committed on

copy xformers attn from ooba since we removed dep on alpaca_lora_4bit
6cb2310

winglian committed on

add support for gradient accumulation steps
3aad5f3

winglian committed on

fix up tokenizer config, isort fix
39a208c

winglian committed on

split up llama model loading so config can be loaded from base config and models can be loaded from a path
2520ecd

winglian committed on

Fix incorrect rebase
594e72b

Nanobit committed on

Fix sharegpt prompt
25eeeeb

Nanobit committed on

fix relative path for fixtures
cfcc549

winglian committed on

Fix security issue or ignore false positives
a1f9850

Nanobit committed on

Update src/axolotl/prompt_strategies/alpaca_instruct.py
c17dae6

Nanobit and winglian committed on

Apply isort then black
37293dc

Nanobit committed on

Fix mypy typing
e9650d3

Nanobit committed on

Fix unsupported operand type(s) for |
be22551

Nanobit committed on

Black formatting
b832a0a

Nanobit committed on

Refactor duplicate code between Prompter and Pygmalion
8e46c0f

Nanobit committed on

Lint wandb
9c6750a

Nanobit committed on

Lint validation
c2dbf2c

Nanobit committed on

Lint tokenization
e6b57de

Nanobit committed on

Lint schedulers
fe1f4c4

Nanobit committed on

Lint dict
633ff21

Nanobit committed on

Lint prompt_tokenizers
5d86137

Nanobit committed on

Lint pygmalion
01c8a33

Nanobit committed on

Lint creative_acr
1645a4d

Nanobit committed on