requirements.txt

Commit History

Separate AutoGPTQ dep to `pip install -e .[auto-gptq]` (#1077)
9be92d1

casperhansen committed on
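For context on what an optional extra like `[auto-gptq]` looks like in packaging metadata, here is a purely illustrative sketch — the project name, version, and dependency lists below are assumptions for the example, not taken from the actual repo:

```toml
# pyproject.toml — illustrative sketch only
[project]
name = "axolotl"
version = "0.0.0"
dependencies = [
    # core deps, always installed
    "transformers",
    "peft",
]

[project.optional-dependencies]
# installed only with: pip install -e .[auto-gptq]
auto-gptq = ["auto-gptq"]
```

Declaring heavy or platform-sensitive packages as extras keeps the default install lean while letting users opt in.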

paired kto support (#1069)
d7057cc

winglian committed on

update peft to 0.7.0 (#1073)
768d348

marktenenholtz committed on

Add: mlflow for experiment tracking (#1059) [skip ci]
090c24d

Johan Hansson and winglian committed on

Phi2 rewrite (#1058)
732851f

winglian committed on

RL/DPO (#935)
f243c21

winglian committed on

bump transformers and update attention class map name (#1023)
bcc78d8

winglian committed on

chore: Update transformers to latest (#986)
7d4185f

Nanobit committed on

update transformers to fix checkpoint saving (#963)
f28e755

dumpmemory committed on

Mixtral official (#942)
7fabc4d

winglian committed on

Update requirements.txt (#940)
9a5eb39

tokestermw committed on

update to latest transformers for mixtral support (#929)
35f9b0f

winglian committed on

update datasets version to cut down the warnings due to pyarrow arg change (#897)
6a4562a

winglian committed on

try #2: pin hf transformers and accelerate to latest release, don't reinstall pytorch (#867)
0de1457

winglian committed on

Feat: Add dataset loading from S3, GCS (#765)
3cc67d2

Nanobit committed on

add e2e tests for checking functionality of resume from checkpoint (#865)
b3a61e8

winglian committed on

Pin optimum package (#838)
105d0b3

Bryan Thornbury committed on

don't compile deepspeed or bitsandbytes from source (#837)
f544ab2

winglian committed on

Feat: Added Gradio support (#812)
738a057

stillerman committed on

fix: pin autogptq (#818)
6459ac7

Nanobit committed on

chore: bump transformers to v4.34.1 to fix tokenizer issue (#745)
8966a6f

Nanobit committed on

pin xformers >= 0.0.22 (#724)
bfbdba8

winglian committed on
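Several commits in this history pin or floor versions; for reference, these are standard PEP 440 specifiers in requirements.txt syntax. The versions below are the ones named in the commit messages in this history, shown only to illustrate the notation:

```
xformers>=0.0.22        # floor: this version or newer
peft==0.7.0             # exact pin
transformers==4.34.1    # exact pin (later bumped by subsequent commits)
```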

Fix(version): Update FA to work with Mistral SWA (#673)
43856c0

Nanobit committed on

Feat: Allow usage of native Mistral FA when no sample_packing (#669)
697c50d

Nanobit committed on

removed duplicate on requirements.txt (#661)
a7e56d8

Napuh committed on

add mistral e2e tests (#649)
5b0bc48

winglian committed on

Mistral flash attn packing (#646)
b6ab8aa

winglian committed on

use fastchat conversations template (#578)
e7d3e2d

winglian committed on

Feat: Add support for upstream FA2 (#626)
19a600a

Nanobit committed on

update README w deepspeed info (#605)
c25ba79

winglian committed on

Update requirements.txt (#610)
ec0958f

Javier committed on

fix wandb so mypy doesn't complain (#562)
bf08044

winglian committed on

Update requirements.txt (#543)
c1921c9

dongxiaolong committed on

update readme to point to direct link to runpod template, cleanup install instructions (#532)
34c0a86

winglian committed on

Add support for GPTQ using native transformers/peft (#468)
3355706

winglian committed on

add eval benchmark callback (#441)
7657632

winglian committed on

customizable ascii art (#506)
548787d

winglian committed on

Fix missing 'packaging' wheel (#482)
c500d02

Maxime committed on

allow newer deps
c29117a

tmm1 committed on

flash attn pip install (#426)
cf66547

mhenrichsen, Ubuntu, Mads Henrichsen, and winglian committed on

remove extra accelerate in requirements (#430)
82e111a

winglian committed on

Attention mask and position id fixes for packing (#285)
2bb0b78

winglian committed on

Merge pull request #355 from tmm1/bitsandbytes-fixes
35c8b90

tmm1 committed on

bump to latest bitsandbytes release with major bug fixes
fce40aa

tmm1 committed on

use newer pynvml package
9c31410

tmm1 committed on

log GPU memory usage
e303d64

tmm1 committed on

pin accelerate so it works with llama2 (#330)
6c9a87c

winglian committed on

latest HEAD of accelerate causes 0 loss immediately w FSDP (#321)
9f69c4d

winglian committed on

add hf_transfer to requirements for faster hf upload
6dd2e7d

winglian committed on
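hf_transfer is picked up by huggingface_hub through an environment variable rather than any code change; a minimal sketch of enabling it, assuming `hf_transfer` and a recent `huggingface_hub` are already installed (e.g. via `pip install hf_transfer`):

```shell
# Enable the Rust-based transfer backend for faster Hugging Face Hub
# uploads and downloads; huggingface_hub reads this variable at runtime.
export HF_HUB_ENABLE_HF_TRANSFER=1
```

With the variable unset, huggingface_hub falls back to its default pure-Python transfer path.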