Collections
Collections including paper arxiv:2311.16079
- Attention Is All You Need
  Paper • 1706.03762 • Published • 50
- BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding
  Paper • 1810.04805 • Published • 16
- RoBERTa: A Robustly Optimized BERT Pretraining Approach
  Paper • 1907.11692 • Published • 7
- DistilBERT, a distilled version of BERT: smaller, faster, cheaper and lighter
  Paper • 1910.01108 • Published • 14

- Multimodal Contrastive Representation Learning in Augmented Biomedical Knowledge Graphs
  Paper • 2501.01644 • Published • 1
- Understanding the Impact of Confidence in Retrieval Augmented Generation: A Case Study in the Medical Domain
  Paper • 2412.20309 • Published
- On the Compositional Generalization of Multimodal LLMs for Medical Imaging
  Paper • 2412.20070 • Published • 43
- MEDEC: A Benchmark for Medical Error Detection and Correction in Clinical Notes
  Paper • 2412.19260 • Published • 1

- The Era of 1-bit LLMs: All Large Language Models are in 1.58 Bits
  Paper • 2402.17764 • Published • 606
- Mixtral of Experts
  Paper • 2401.04088 • Published • 158
- Mistral 7B
  Paper • 2310.06825 • Published • 47
- Don't Make Your LLM an Evaluation Benchmark Cheater
  Paper • 2311.01964 • Published • 1