---
license: apache-2.0
language:
- en
pipeline_tag: fill-mask
datasets:
- wikipedia
- bookcorpus
tags:
- physics
- astrophysics
- high-energy physics (HEP)
- history of science
- philosophy of science
- sociology of science
- epistemic change
---
# Model Card for Astro-HEP-BERT
**Astro-HEP-BERT** is a bidirectional transformer designed primarily to generate contextualized word embeddings for analyzing epistemic change in astrophysics and high-energy physics (<a target="_blank" rel="noopener noreferrer" href="https://doi.org/10.3030/101044932" >NEPI project</a> at TU Berlin). Built on Google's "bert-base-uncased," the model was further trained for three epochs on approximately 21.5 million paragraphs extracted from around 600,000 scholarly articles sourced from arXiv, all pertaining to astrophysics and/or high-energy physics (HEP). The sole training objective was masked language modeling.
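As a rough illustration of the intended use, the sketch below extracts contextualized token embeddings with the Hugging Face `transformers` library. The repository ID `arnosimons/astro-hep-bert` and the example sentence are assumptions for illustration, not official documentation (which is still forthcoming).

```python
# Minimal sketch: extract contextualized word embeddings with Astro-HEP-BERT.
# NOTE: the repo ID below is an assumption; adjust to the actual hosted model.
import torch
from transformers import AutoModel, AutoTokenizer

model_id = "arnosimons/astro-hep-bert"  # assumed Hugging Face repository ID
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModel.from_pretrained(model_id)
model.eval()

sentence = "The inflaton field drives the exponential expansion of the early universe."
inputs = tokenizer(sentence, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# One 768-dimensional vector per (sub)word token from the last hidden layer.
embeddings = outputs.last_hidden_state.squeeze(0)
tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0])
for token, vector in zip(tokens, embeddings):
    print(f"{token:>12s}  {vector[:3].tolist()}")
```

For diachronic analyses of epistemic change, embeddings of the same term can then be collected across many paragraphs and compared, for example via cosine similarity or clustering.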
The Astro-HEP-BERT project embodies the spirit of a tabletop experiment or grassroots scientific effort. Training relied exclusively on open-source inputs, and the entire run of three epochs was completed on a single MacBook Pro M2 with 96 GB of RAM over the course of six weeks. The project stands as a proof of concept, demonstrating that a bidirectional transformer can serve research in the history, philosophy, and sociology of science (HPSS) even with limited financial resources.
For further insights into the model, the corpus, and the underlying research project, please refer to the Astro-HEP-BERT paper [link coming soon].
<!-- <a target="_blank" rel="noopener noreferrer" href="">Astro-HEP-BERT paper</a>. -->
## Model Details
- **Developer:** <a target="_blank" rel="noopener noreferrer" href="https://www.tu.berlin/en/hps-mod-sci/arno-simons">Arno Simons</a>
- **Funded by:** European Research Council (ERC) under Grant agreement ID: <a target="_blank" rel="noopener noreferrer" href="https://doi.org/10.3030/101044932" >101044932</a>
- **Language (NLP):** English
- **License:** apache-2.0
- **Parent model:** Google's "<a target="_blank" rel="noopener noreferrer" href="https://github.com/google-research/bert">bert-base-uncased</a>"
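Because the checkpoint was trained with a masked-language-modeling objective (hence the `fill-mask` pipeline tag above), it can also be probed directly for masked-token predictions. Again, a minimal sketch under the same assumed repository ID:

```python
# Minimal sketch: query the model through the fill-mask pipeline.
# NOTE: the repo ID is the same assumption as in the embedding example above.
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="arnosimons/astro-hep-bert")
for prediction in fill_mask("The accretion disk surrounds a supermassive black [MASK]."):
    print(prediction["token_str"], round(prediction["score"], 3))
```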
<!---
## How to Get Started with the Model
Use the code below to get started with the model.
[Coming soon]
## Citation
**BibTeX:**
[Coming soon]
**APA:**
[Coming soon]
-->