- Attention Is All You Need
  Paper • 1706.03762 • Published • 40
- BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding
  Paper • 1810.04805 • Published • 14
- RoBERTa: A Robustly Optimized BERT Pretraining Approach
  Paper • 1907.11692 • Published • 7
- DistilBERT, a distilled version of BERT: smaller, faster, cheaper and lighter
  Paper • 1910.01108 • Published • 12
Collections
Collections including paper arxiv:2311.16079
- Large Language Models as Biomedical Hypothesis Generators: A Comprehensive Evaluation
  Paper • 2407.08940 • Published
- CLIMB: A Benchmark of Clinical Bias in Large Language Models
  Paper • 2407.05250 • Published • 1
- How do you know that? Teaching Generative Language Models to Reference Answers to Biomedical Questions
  Paper • 2407.05015 • Published • 4
- BioMNER: A Dataset for Biomedical Method Entity Recognition
  Paper • 2406.20038 • Published