Pass a weakref to the model in the SIGINT handler to free up the model after the train function (#1581) dde02fc chiragjn, winglian, committed on May 3
FIX: TRL trainer preprocessing step was running in one process (#1583) b9bb169 Ali Mosavian, committed on May 3
Add debug option for RL dataset preprocessing (#1404) cc5d31e abhinand, Nanobit, committed on Apr 30
Make sure everything stays in the same dtype when using DPO + FSDP (#1559) 68601ec winglian, committed on Apr 22
Add support for Gemma chat template (#1530) 60f5ce0 Haoxiang-Wang, winglian, committed on Apr 21
Wrap prepared_ds_path in str() to avoid TypeError in the fsspec package (#1548) 7477a53 Frank Ruis, winglian, committed on Apr 21
Update SaveAxolotlConfigtoWandBCallback to use artifact instead of save (#1483) 5ed2939 tcapelle, winglian, committed on Apr 9
Use locale-agnostic separator to make large numbers easier to read (#1503) da9b1a3 winglian, committed on Apr 9
WIP: Support table logging for MLflow, too (#1506) 057fa44 Dave Farago, winglian, committed on Apr 9
Correctly handle splits for datasets.arrow_dataset.Dataset objects (#1504) 8fa0785 scottifer8, winglian, committed on Apr 9
Add field to SFT dataset pydantic model for completion support (#1497) ff01c45 winglian, committed on Apr 9
Ignore issues with calculating the number of params when printing (#1493) 2fa65b9 winglian, committed on Apr 8
Drop empty token from beginning if tokenizer has no bos_token (as in the case of Qwen) (#1490) 934fc85 winglian, committed on Apr 7
feat: validate that sample packing requires flash_attention (#1465) bf4cd67 Nanobit, committed on Apr 5
Refactor utils.data module for the line-count linter (#1476) e0fcef4 winglian, committed on Apr 4
Support loading datasets saved via save_to_disk (#1432) e634118 fozziethebeat, committed on Mar 29
Support layer replication for PEFT and fix rsLoRA integration (#1445) 25afd35 winglian, committed on Mar 27
Fix accelerate env var for auto bf16, add new base image, and expand torch_cuda_arch_list support (#1413) da265dd winglian, committed on Mar 26
Make sure to capture non-null defaults from config validation (#1415) 601b77b winglian, committed on Mar 26
fix(dataset): normalize tokenizer config and change hash from tokenizer class to tokenizer path (#1298) ff939d8 Nanobit, committed on Mar 25
Strip out hacky qlora-fsdp workarounds now that the qlora-fsdp fixes are upstreamed (#1428) 2a1589f winglian, committed on Mar 21
HF / FEAT: Optimize HF tags (#1425) [skip ci] 7d55607 Younes Belkada, winglian, committed on Mar 21
Support GaLore now that it is upstreamed into transformers (#1409) dd449c5 winglian, committed on Mar 19
Add a config option not to shuffle the merged dataset (#1394) [skip ci] 43bdc5d seungduk, winglian, committed on Mar 19
fix(config): pass gradient_checkpoint_kwargs through (#1412) b1e3e1b Nanobit, committed on Mar 19