jina-bert-flash-implementation / modeling_lora.py

Commit History (all commits authored by Markus28; subjects kept verbatim)

6f8e8c5  fix: use staticmethod istead of classmethod
d9a681b  feat: added comment
dae5c58  feat: added docstrings
702e6c9  feat: Allow LoRA to be merged into weights (#12)  [verified]
151f328  fix: fixed from_bert method
20706dd  fix: fix LoRA implementation
462e28d  feat: only apply select_task_for_layer if task has changed
a416a9d  feat: make num of loras part of the config
cdf5490  feat: make main parameters trainable
e93b0fd  fix: fixed syntax error in LoRA
9410275  feat: add current_task to forward
0ff7c3d  feat: use property in LoRA parametrization
faa9951  feat: added LoRA copyright notice
6aad619  feat: use property instead of setter
5549314  feat: return from_bert for from_pretrained
851184a  feat: made from_bert work
fabeb13  feat: select first LoRA upon initialization
617fe56  feat: formatting and type hints
850b9a2  fix: use proper initilization for embedding layer
5c4e4bf  fix: fixed typo
8561a1f  feat: added LoRA
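Commit 702e6c9 ("Allow LoRA to be merged into weights") refers to a standard LoRA trick: fold the trained low-rank update into the frozen base weight so inference needs no extra matmuls. The following is a minimal numpy sketch of that identity, not the repository's actual implementation; all variable names (`W`, `A`, `B`, `scaling`) are illustrative assumptions.

```python
import numpy as np

# Sketch of "merging LoRA into weights": the unmerged forward pass
# (base path + scaled low-rank path) equals a single matmul against
# W + scaling * B @ A. Names and shapes are hypothetical.
rng = np.random.default_rng(0)

d_in, d_out, r, alpha = 8, 6, 2, 4.0
W = rng.normal(size=(d_out, d_in))     # frozen base weight
A = rng.normal(size=(r, d_in))         # LoRA "down" projection
B = rng.normal(size=(d_out, r))        # LoRA "up" projection (pretend it was trained)
scaling = alpha / r                    # usual LoRA scaling factor

def lora_forward(x):
    # Unmerged: base output plus the scaled low-rank correction.
    return x @ W.T + scaling * (x @ A.T) @ B.T

# Merged: fold the update into the weight once, then forward is one matmul.
W_merged = W + scaling * B @ A

x = rng.normal(size=(3, d_in))
assert np.allclose(lora_forward(x), x @ W_merged.T)
```

The benefit is purely at inference time: after merging, the layer is an ordinary linear layer, at the cost of no longer being able to hot-swap adapters without keeping the original `W` around.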
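Several commits (9410275 "add current_task to forward", 462e28d "only apply select_task_for_layer if task has changed", fabeb13 "select first LoRA upon initialization", a416a9d "make num of loras part of the config") point to a multi-adapter design: one frozen weight, several low-rank adapters, and a cached task selector. Below is a hedged numpy sketch of that idea under assumed names; the class, its API, and the zero-initialization of `B` are illustrative, not the repository's code.

```python
import numpy as np

class MultiLoRALinear:
    """Hypothetical linear layer with one adapter per task.

    The number of adapters is part of the config (cf. a416a9d), the first
    adapter is selected at init (cf. fabeb13), and the setter only reacts
    when the task actually changes (cf. 462e28d).
    """

    def __init__(self, d_in, d_out, num_loras, r=2, alpha=4.0, seed=0):
        rng = np.random.default_rng(seed)
        self.W = rng.normal(size=(d_out, d_in))       # frozen base weight
        self.A = rng.normal(size=(num_loras, r, d_in))
        self.B = np.zeros((num_loras, d_out, r))      # zero-init: adapters start as no-ops
        self.scaling = alpha / r
        self._task = 0                                # select first LoRA upon initialization

    @property
    def current_task(self):
        return self._task

    @current_task.setter
    def current_task(self, task):
        if task != self._task:      # skip re-selection when the task is unchanged
            self._task = task

    def forward(self, x, task=None):
        if task is not None:
            self.current_task = task
        A, B = self.A[self._task], self.B[self._task]
        return x @ self.W.T + self.scaling * (x @ A.T) @ B.T
```

Routing the selection through a property (cf. 0ff7c3d, 6aad619) keeps the "has the task changed?" check in one place instead of scattering it across every forward call.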