
Model Card for Astro-HEP-BERT

Astro-HEP-BERT is a bidirectional transformer designed primarily to generate contextualized word embeddings for analyzing epistemic change in astrophysics and high-energy physics (see NEPI research project). Built upon Google's bert-base-uncased, the model underwent additional training for three epochs using approximately 21.5 million paragraphs extracted from around 600,000 scholarly articles sourced from arXiv, all pertaining to astrophysics and/or high-energy physics (HEP). The sole training objective was masked language modeling.
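
As a usage sketch (not part of the original model card), contextualized word embeddings can be obtained from the model with the Hugging Face transformers library. The model ID follows the repository name; the input sentence is purely illustrative.

```python
import torch
from transformers import AutoTokenizer, AutoModel

# Load the tokenizer and the encoder (model ID taken from the repository name)
tokenizer = AutoTokenizer.from_pretrained("arnosimons/astro-hep-bert")
model = AutoModel.from_pretrained("arnosimons/astro-hep-bert")
model.eval()

# Illustrative example sentence
sentence = "Dark matter halos shape the large-scale structure of the universe."
inputs = tokenizer(sentence, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

# Last hidden state: one contextualized vector per (sub)word token,
# shape (batch_size, sequence_length, hidden_size)
token_embeddings = outputs.last_hidden_state[0]

for token, vector in zip(
    tokenizer.convert_ids_to_tokens(inputs["input_ids"][0]), token_embeddings
):
    print(token, vector.shape)
```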

The Astro-HEP-BERT project embodies the spirit of a tabletop experiment or grassroots scientific effort. It relied exclusively on open-source inputs during training, and the entire training process (three epochs) was completed on a single MacBook Pro M2 with 96 GB of RAM over a span of six weeks. The project stands as a proof of concept, demonstrating that a bidirectional transformer can be trained for research in the history, philosophy, and sociology of science (HPSS) even with limited financial resources.

For further insights into the model, the corpus, and the underlying research project, please refer to the Astro-HEP-BERT paper [link coming soon].

Model Details

  • Developer: Arno Simons
  • Funded by: European Research Council (ERC) under Grant agreement ID: 101044932
  • Language (NLP): English
  • License: apache-2.0
  • Parent model: Google's bert-base-uncased
  • Format: Safetensors
  • Model size: 110M parameters
  • Tensor type: F32
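
Because the sole training objective was masked language modeling, the model can also be probed directly with the fill-mask pipeline. This is a minimal sketch, assuming the uploaded checkpoint includes the masked-language-modeling head; the prompt is illustrative.

```python
from transformers import pipeline

# Query the masked-language-modeling head directly (illustrative example)
fill = pipeline("fill-mask", model="arnosimons/astro-hep-bert")

for prediction in fill("The accretion disk surrounds a supermassive [MASK]."):
    print(prediction["token_str"], round(prediction["score"], 3))
```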
