Collections
Collections including paper arxiv:2307.09288
- One Wide Feedforward is All You Need
  Paper • 2309.01826 • Published • 31
- Gated recurrent neural networks discover attention
  Paper • 2309.01775 • Published • 6
- FLM-101B: An Open LLM and How to Train It with $100K Budget
  Paper • 2309.03852 • Published • 42
- Large Language Models as Optimizers
  Paper • 2309.03409 • Published • 72
- Attention Is All You Need
  Paper • 1706.03762 • Published • 36
- Language Models are Few-Shot Learners
  Paper • 2005.14165 • Published • 10
- Learning to summarize from human feedback
  Paper • 2009.01325 • Published • 2
- Training language models to follow instructions with human feedback
  Paper • 2203.02155 • Published • 11
- Llama 2: Open Foundation and Fine-Tuned Chat Models
  Paper • 2307.09288 • Published • 235
- Large-Scale Automatic Audiobook Creation
  Paper • 2309.03926 • Published • 52
- From Sparse to Dense: GPT-4 Summarization with Chain of Density Prompting
  Paper • 2309.04269 • Published • 29
- Textbooks Are All You Need II: phi-1.5 technical report
  Paper • 2309.05463 • Published • 84