This is German BERT (bert-base-german-cased) fine-tuned on German-language medical text. It was trained only on the target task of masked language modeling, so it can serve as a base for a downstream task of your choice; I used it for the NTS-ICD-10 text classification task.
Language model: bert-base-german-cased
Fine-tuning data: Medical articles (diseases, symptoms, therapies, etc.)
Eval data: NTS-ICD-10 dataset (Classification)
Infrastructure: Google Colab
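Because the fine-tuning objective was masked language modeling, the checkpoint can be queried directly with the `transformers` fill-mask pipeline. The sketch below uses the base model id `bert-base-german-cased` as a stand-in, since this card does not state the fine-tuned checkpoint's Hub id; substitute the actual checkpoint path or id.

```python
from transformers import pipeline

# Load a fill-mask pipeline. "bert-base-german-cased" is the base model
# and stands in here for the fine-tuned medical checkpoint, whose Hub id
# is not given on this card.
fill_mask = pipeline("fill-mask", model="bert-base-german-cased")

# Query the masked-language-model head with a German medical sentence.
predictions = fill_mask("Der Patient leidet unter starken [MASK].")
for p in predictions:
    print(p["token_str"], round(p["score"], 4))
```

Each prediction is a dict with the filled-in token (`token_str`), its probability (`score`), and the completed sentence (`sequence`); by default the pipeline returns the top five candidates.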
Get in touch: shresthamanjil21 [at] gmail.com, LinkedIn