qwerrwe / tests / test_validation.py

Commit History

ADD: warning hub model (#1301)
601c08b

JohanWork and Nanobit committed

feat: validate sample packing requires flash_attention (#1465)
bf4cd67

Nanobit committed

make sure to capture non-null defaults from config validation (#1415)
601b77b

winglian committed

fix for protected model_ namespace w pydantic (#1345)
6b3b271

winglian committed

more fixes 20240228 (#1342) [skip ci]
0f985e1

winglian committed

Pydantic 2.x cfg (#1239)
cc3cebf

winglian committed

Peft lotfq (#1222)
4cb7900

winglian committed

ADD: warning if hub_model_id ist set but not any save strategy (#1202)
af29d81

JohanWork and winglian committed

Phi2 multipack (#1173)
814aee6

winglian committed

Deprecate max packed sequence len (#1141)
2ce5c0d

winglian committed

Add `layers_to_transform` for `lora_config` (#1118)
8487b97

xzuyn committed

add gptneox embeddings, fix phi2 inputs, also fix the casting (#1083)
78c5b19

winglian committed

be more robust about checking embedding modules for lora finetunes (#1074) [skip ci]
0f10080

winglian committed

attempt to also run e2e tests that needs gpus (#1070)
788649f

winglian committed

Feat: Warns to add to modules_to_save when adding tokens or switching special_tokens (#787)
1ffa386

Nanobit committed

Feat(wandb): Refactor to be more flexible (#767)
a1da39c

Nanobit committed

Feat: Add warmup_ratio (#893)
fb12895

Nanobit committed

Fix: Warn when fullfinetune without adapter (#770)
44c9d01

Nanobit committed

Fix: eval table conflict with eval_sample_packing (#769)
9923b72

Nanobit committed

Fix(cfg): Add validation for save_strategy and eval_strategy (#633)
383f88d

Nanobit committed

use fastchat conversations template (#578)
e7d3e2d

winglian committed

Fix: Fail bf16 check when running on cpu during merge (#631)
cfbce02

Nanobit committed

recommend padding when using sample packing (#531)
3437149

winglian committed

extract module for working with cfg
8cec513

tmm1 committed

Attention mask and position id fixes for packing (#285)
2bb0b78

winglian committed

params are adam_*, not adamw_*
19cf0bd

winglian committed

Additional test case per pr
ad5ca4f

winglian committed

add validation and tests for adamw hyperparam
cb9d3af

winglian committed

Merge branch 'main' into flash-optimum
fd2c981

winglian committed

new validation for mpt w grad checkpoints
14668fa

winglian committed

add streaming dataset support for pretraining datasets
eea2731

winglian committed

Validate falcon with fsdp
babf0fd

Nanobit committed

Update doc for grad_accu and add validation tests for batch size
3c71c8d

Nanobit committed

black formatting
6fa40bf

winglian committed

add support for gradient accumulation steps
3aad5f3

winglian committed

Apply isort then black
37293dc

Nanobit committed

Ignore unsupported-binary-operation
0dd35c7

Nanobit committed

Black formatting
b832a0a

Nanobit committed

Lint validation
1f3c3f5

Nanobit committed

update for pr feedback
fd5f965

winglian committed

new hf_use_auth_token setting so login to hf isn't required
1c33eb8

winglian committed

Feat: Update validate_config and add tests
52dd92a

Nanobit committed