Mathematical Structure Aware BERT

A pretrained model based on bert-base-cased, further pre-trained on mathematical text.

Before this mathematical pre-training, 300 mathematical LaTeX tokens were added to the bert-base-cased vocabulary.
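A minimal usage sketch with the Hugging Face transformers library (assuming transformers and torch are installed; the masked sentence is purely illustrative):

```python
from transformers import AutoTokenizer, AutoModelForMaskedLM
import torch

# Load the tokenizer and model from the Hugging Face Hub.
tokenizer = AutoTokenizer.from_pretrained("ddrg/math_structure_bert")
model = AutoModelForMaskedLM.from_pretrained("ddrg/math_structure_bert")

# The vocabulary is larger than bert-base-cased's (28,996 tokens)
# because of the added mathematical LaTeX tokens.
print(len(tokenizer))

# Illustrative masked-LM query on a mathematical sentence.
text = f"The derivative of x^2 is 2{tokenizer.mask_token}."
inputs = tokenizer(text, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# Report the top prediction for the masked position.
mask_index = (inputs.input_ids == tokenizer.mask_token_id).nonzero()[0, 1]
predicted_id = logits[0, mask_index].argmax().item()
print(tokenizer.decode([predicted_id]))
```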

Datasets used to train ddrg/math_structure_bert