TL;DR Make your model write "margin notes" as you chunk-prefill the KV cache. Then ask it to reread all the notes before it speaks up. Works for humans, works for AI.
WiM leverages chunked prefill of the key-value cache: at each prefill step it concurrently generates a query-based extractive summary, and these summaries are reintegrated at the end of the computation. We term these intermediate outputs "margins", drawing inspiration from the practice of making margin notes to improve comprehension of long texts in human reading. We show that this technique, which adds only minimal extra computation, significantly improves LLMs' long-context reasoning capabilities.
Think: every chunk gets a chance to be attended to, i.e. to sit at the end of the context, at least once.
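Here is a minimal, prompt-level sketch of that loop, assuming a generic `llm(prompt) -> str` callable. The real WiM works directly on the chunked-prefill KV cache; the character-based chunking, prompt wording, and relevance filter below are illustrative assumptions, not the paper's exact implementation. The per-chunk loop is also the natural hook for the progress updates mentioned further down.

```python
from typing import Callable, List

def writing_in_the_margins(
    llm: Callable[[str], str],   # hypothetical: any prompt-in, text-out LLM call
    context: str,
    query: str,
    chunk_size: int = 4096,      # characters per chunk; the paper chunks tokens during prefill
) -> str:
    """Sketch of the WiM control flow: per-chunk extractive 'margins',
    reintegrated before the final answer is generated."""
    # 1. Split the long context into chunks (the real method splits the tokenized
    #    input and prefills the KV cache chunk by chunk, reusing the prefix).
    chunks: List[str] = [context[i:i + chunk_size]
                         for i in range(0, len(context), chunk_size)]

    # 2. After each chunk, ask for a query-based extractive summary (a "margin").
    margins: List[str] = []
    for i, chunk in enumerate(chunks, start=1):
        margin = llm(
            f"Passage {i}:\n{chunk}\n\n"
            f"Extract only the information relevant to answering: {query}\n"
            f"If nothing is relevant, reply 'IRRELEVANT'."
        )
        if "IRRELEVANT" not in margin:
            margins.append(margin)
        print(f"Prefilled chunk {i}/{len(chunks)}")  # natural place for a progress bar

    # 3. Reintegrate all margins at the end, then answer.
    notes = "\n".join(f"- {m}" for m in margins)
    return llm(
        f"Margin notes gathered while reading the document:\n{notes}\n\n"
        f"Using these notes, answer the question: {query}"
    )
```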
Results:
- An average accuracy boost of 7.5% on multi-hop reasoning tasks like HotpotQA and MultiHop-RAG.
- A 30% increase in F1 score on summarisation-like tasks (CWE).
Plus, WiM fits seamlessly into interactive applications (think: progress bar!). It can provide real-time progress updates during data retrieval and integration, making it user-friendly and transparent, in stark contrast to feeding 1M tokens to an LLM and waiting 6 minutes for the first token.
Today, Writer dropped Palmyra-Med-70b and Palmyra-Fin-70b, two new domain-specific models that are setting a new standard for medical and financial model performance.
TL;DR Palmyra-Med-70b
- 8k and 32k versions available
- MMLU performance of ~86%, outperforming other top models
- Great for diagnosing, planning treatments, medical research, insurance coding and billing
- Open-model license for non-commercial use cases
- Available on Hugging Face: Writer/Palmyra-Med-70B
- Live on NVIDIA NIM: https://build.nvidia.com/writer/palmyra-med-70b
Palmyra-Fin-70b
- Passed the CFA Level III exam with a 73% score, the first model to do so
- Skilled at complex tasks like investment research, financial analysis, and sentiment analysis
- Outperformed other top models on long-fin-eval, a benchmark of real-world use cases
- Open-model license for non-commercial use cases
- Available on Hugging Face: Writer/Palmyra-Fin-70B-32K
- Live on NVIDIA NIM: https://build.nvidia.com/writer/palmyra-fin-70b-32k
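If you want to try the models locally, here is a minimal sketch using the standard Hugging Face `transformers` API. The prompt, generation settings, and plain-completion format are assumptions for illustration; check the model card for the expected chat template, and swap in `Writer/Palmyra-Fin-70B-32K` for the finance model.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Writer/Palmyra-Med-70B"  # or "Writer/Palmyra-Fin-70B-32K"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # 70B weights: expect to shard across several GPUs
    device_map="auto",
)

# Illustrative prompt only; not from the model card.
prompt = "List common drug interactions to check before prescribing warfarin."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))
```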