chore(root): Updates files to internal transformers implementation. 8a2c68b gugarosa committed on Jan 5, 2024
Disables inference API to prevent mismatch with HF implementation. e8a38cd gugarosa committed on Dec 13, 2023
fix(modeling_phi): Fixes initial generation with length larger than context length. f4e55a8 gugarosa committed on Dec 8, 2023
fix(modeling_phi): Fixes cached generation when above maximum context length. ecfe56e gugarosa committed on Dec 5, 2023
Fixes exceeding maximum sequence length when using generate(). 759d148 gugarosa committed on Nov 20, 2023
Fixes any potential overflow when calculating attention weights. b5c5161 gugarosa committed on Nov 16, 2023
Adds support for flash-attn rotary embedding and fused dense layers. 90c38d9 gugarosa committed on Nov 1, 2023
Adds support for MQA/GQA and attention mask during training / fine-tuning. 371fd51 gugarosa committed on Oct 30, 2023
fix(phi-1): Checks length of `attention_mask` if it is passed as a direct tensor. 1f890f7 gugarosa committed on Sep 26, 2023
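
As a concrete illustration of the overflow fix referenced by commit b5c5161 above: a minimal sketch, assuming the fix amounts to upcasting the attention scores to float32 before the softmax so that large logits cannot overflow in half precision. The function name and tensor shapes below are hypothetical illustrations, not the repo's actual API.

```python
import torch

def attention_weights(q: torch.Tensor, k: torch.Tensor) -> torch.Tensor:
    """Scaled dot-product attention weights with the softmax computed in
    float32. Illustrative sketch only; names and shapes are hypothetical."""
    softmax_scale = q.shape[-1] ** -0.5
    # Upcast to float32 before the matmul/softmax: fp16 scores can overflow
    # for large logits, producing inf/NaN attention weights.
    scores = torch.matmul(q.float(), k.float().transpose(-1, -2)) * softmax_scale
    weights = torch.softmax(scores, dim=-1)
    # Cast back to the input dtype for the rest of the forward pass.
    return weights.to(q.dtype)

# Usage: batch 1, 4 heads, 8 positions, head dim 32, in half precision.
q = torch.randn(1, 4, 8, 32, dtype=torch.float16)
k = torch.randn(1, 4, 8, 32, dtype=torch.float16)
w = attention_weights(q, k)
assert w.dtype == torch.float16 and w.shape == (1, 4, 8, 8)
```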