Collections
Discover the best community collections!
Collections trending this week
- Attention Is All You Need
  Paper • 1706.03762 • Published • 37
- FlashAttention-2: Faster Attention with Better Parallelism and Work Partitioning
  Paper • 2307.08691 • Published • 6
- Mixtral of Experts
  Paper • 2401.04088 • Published • 154
- Mistral 7B
  Paper • 2310.06825 • Published • 45

- PockEngine: Sparse and Efficient Fine-tuning in a Pocket
  Paper • 2310.17752 • Published • 11
- S-LoRA: Serving Thousands of Concurrent LoRA Adapters
  Paper • 2311.03285 • Published • 27
- Parameter-Efficient Orthogonal Finetuning via Butterfly Factorization
  Paper • 2311.06243 • Published • 17
- Fine-tuning Language Models for Factuality
  Paper • 2311.08401 • Published • 26