- RoFormer: Enhanced Transformer with Rotary Position Embedding • Paper • 2104.09864
- Attention Is All You Need • Paper • 1706.03762
- LoRA: Low-Rank Adaptation of Large Language Models • Paper • 2106.09685
- FlashAttention: Fast and Memory-Efficient Exact Attention with IO-Awareness • Paper • 2205.14135