dapBERT

dapBERT-multi is a BERT-like model for the patent domain, trained with the domain-adaptive pretraining (DAPT) method of Gururangan et al. (2020). bert-base-multilingual-cased is used as the base model. The training corpus consists of 10,000,000 patent abstracts filed between 1998 and 2020 with the US and European patent offices and with the World Intellectual Property Organization (WIPO).
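Since the model follows the standard BERT architecture, it should load with the Hugging Face transformers library as usual; below is a minimal sketch of masked-token prediction. The repository id `<namespace>/dapBERT-multi` is a placeholder assumption, not the actual hub id; replace it with the id shown on this model page.

```python
# Minimal fill-mask sketch for dapBERT-multi, assuming a standard BERT-style
# checkpoint on the Hugging Face hub. The model id below is a placeholder.
from transformers import AutoModelForMaskedLM, AutoTokenizer, pipeline

model_id = "<namespace>/dapBERT-multi"  # hypothetical id; substitute the real one

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForMaskedLM.from_pretrained(model_id)

# Predict the masked token in a patent-style sentence.
fill = pipeline("fill-mask", model=model, tokenizer=tokenizer)
for pred in fill("The present invention relates to a [MASK] for wireless communication."):
    print(f"{pred['token_str']}: {pred['score']:.3f}")
```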
