Commit History

feat: set self.dropout_p in constructor
02ebe52

Markus28 committed on

fix: use attention dropout with torch SDPA implementation
5bc2987

Markus28 committed on

Allow device auto map (#8)
c41d17d

Jackmin108 committed on

Truncate to 8k by default (#5)
43f3955

Jackmin108 committed on

Set max length to 2B
619ca8d

Jackmin108 committed on

Update README.md
a9db862

Jackmin108 committed on

Update README.md
f838124

Jackmin108 committed on

Allow pytorch<2 to use without passing attn_implementation flag (#4)
b5794c5

Jackmin108 committed on

chore: update from afe81ca705ca1a5bd6b7d90548fcac068850b2af
344bcbc

Team Finetuner committed on

Remove triton flash implementation
5ee2c37

Jackmin108 committed on

Delete flash_attn_triton.py
4fa2261

Jackmin108 committed on

chore: update from 896c12d73073854c513200fb74a4887cf25b2b97
96e9a75

Team Finetuner committed on

chore: update from f36c08c8a58c21b5aaab523fa03fb4a24b475612
3e3ced0

Team Finetuner committed on

chore: update from 07ce15d58b77559fce77ea89e92d398f28663bd9
0f4070e

Team Finetuner committed on

feat: allow changing flash implementation
43b8513

Jackmin801 committed on

allow math kernel
bc43a5e

Jackmin801 committed on

Flash attention! (#2)
df1a7f6

Jackmin108 committed on

Create README.md
eb9e889

Jackmin108 committed on

Allow flash attn (#1)
622abd4

Jackmin108 committed on

rename to jina bert in configuration file
33026dc

alaeddine-13 committed on

rename to jina bert
b4f2b16

alaeddine-13 committed on

add basic configuration and model file
e36c994

alaeddine-13 committed on