Collections including paper arxiv:2311.16079
- Attention Is All You Need
  Paper • 1706.03762 • Published • 49
- BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding
  Paper • 1810.04805 • Published • 16
- RoBERTa: A Robustly Optimized BERT Pretraining Approach
  Paper • 1907.11692 • Published • 7
- DistilBERT, a distilled version of BERT: smaller, faster, cheaper and lighter
  Paper • 1910.01108 • Published • 14

- Clinical Document Corpora and Assorted Domain Proxies: A Survey of Diversity in Corpus Design, with Focus on German Text Data
  Paper • 2412.00230 • Published • 1
- AfriMed-QA: A Pan-African, Multi-Specialty, Medical Question-Answering Benchmark Dataset
  Paper • 2411.15640 • Published • 4
- ClinicalBench: Can LLMs Beat Traditional ML Models in Clinical Prediction?
  Paper • 2411.06469 • Published • 17
- Medical Adaptation of Large Language and Vision-Language Models: Are We Making Progress?
  Paper • 2411.04118 • Published • 1

- The Era of 1-bit LLMs: All Large Language Models are in 1.58 Bits
  Paper • 2402.17764 • Published • 604
- Mixtral of Experts
  Paper • 2401.04088 • Published • 158
- Mistral 7B
  Paper • 2310.06825 • Published • 47
- Don't Make Your LLM an Evaluation Benchmark Cheater
  Paper • 2311.01964 • Published • 1