Collections
Collections including paper arxiv:2402.13598
- Evaluating Very Long-Term Conversational Memory of LLM Agents
  Paper • 2402.17753 • Published • 18
- StructLM: Towards Building Generalist Models for Structured Knowledge Grounding
  Paper • 2402.16671 • Published • 26
- Do Large Language Models Latently Perform Multi-Hop Reasoning?
  Paper • 2402.16837 • Published • 24
- Divide-or-Conquer? Which Part Should You Distill Your LLM?
  Paper • 2402.15000 • Published • 22

- PALO: A Polyglot Large Multimodal Model for 5B People
  Paper • 2402.14818 • Published • 23
- LongRoPE: Extending LLM Context Window Beyond 2 Million Tokens
  Paper • 2402.13753 • Published • 111
- User-LLM: Efficient LLM Contextualization with User Embeddings
  Paper • 2402.13598 • Published • 18
- Coercing LLMs to do and reveal (almost) anything
  Paper • 2402.14020 • Published • 12

- TofuEval: Evaluating Hallucinations of LLMs on Topic-Focused Dialogue Summarization
  Paper • 2402.13249 • Published • 10
- The FinBen: An Holistic Financial Benchmark for Large Language Models
  Paper • 2402.12659 • Published • 16
- Instruction-tuned Language Models are Better Knowledge Learners
  Paper • 2402.12847 • Published • 24
- Synthetic Data (Almost) from Scratch: Generalized Instruction Tuning for Language Models
  Paper • 2402.13064 • Published • 46

- User-LLM: Efficient LLM Contextualization with User Embeddings
  Paper • 2402.13598 • Published • 18
- Personalized Audiobook Recommendations at Spotify Through Graph Neural Networks
  Paper • 2403.05185 • Published • 20
- SPAR: Personalized Content-Based Recommendation via Long Engagement Attention
  Paper • 2402.10555 • Published • 32

- User-LLM: Efficient LLM Contextualization with User Embeddings
  Paper • 2402.13598 • Published • 18
- ShortGPT: Layers in Large Language Models are More Redundant Than You Expect
  Paper • 2403.03853 • Published • 62
- From Words to Numbers: Your Large Language Model Is Secretly A Capable Regressor When Given In-Context Examples
  Paper • 2404.07544 • Published • 18