Commit History

Adds support for MQA/GQA and attention mask during training / fine-tuning.
371fd51

gugarosa committed on

Upload modeling_mixformer_sequential.py
633bca1

gugarosa committed on

Upload README.md
769684a

gugarosa committed on

fix(phi-1): Checks length of `attention_mask` if it is passed as a direct tensor.
1f890f7

gugarosa committed on

Support for `attention_mask` in forward pass.
d22f35e

gugarosa committed on

Upload MixFormerSequentialForCausalLM
44cca9f

suriyagunasekar committed on

Upload MixFormerSequentialForCausalLM
e96b200

suriyagunasekar committed on

Upload MixFormerSequentialForCausalLM
0f4ae0e

suriyagunasekar committed on