---
license: mit
language:
  - en
library_name: transformers
pipeline_tag: text-classification
tags:
  - medical
base_model: emilyalsentzer/Bio_ClinicalBERT
---

# BJH-perioperative-notes-bioClinicalBERT

This clinical foundation model predicts post-operative surgical outcomes from clinical notes taken during perioperative care. It was fine-tuned from the emilyalsentzer/Bio_ClinicalBERT model through a multi-task learning approach spanning the following 6 outcomes:

- Death within 30 days
- Deep vein thrombosis (DVT)
- Pulmonary embolism (PE)
- Pneumonia
- Acute kidney injury
- Delirium
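To illustrate what a multi-task setup over these six outcomes can look like, here is a hypothetical sketch of a shared-encoder classification head: one pooled representation feeds six independent binary classifiers. The layer names, sizes, and outcome keys are illustrative assumptions, not the model's actual architecture.

```python
import torch
import torch.nn as nn

# Illustrative outcome names only; the real model's task labels may differ.
OUTCOMES = ["death_30d", "dvt", "pe", "pneumonia", "acute_kidney_injury", "delirium"]

class MultiTaskHead(nn.Module):
    """Hypothetical multi-task head: one linear classifier per outcome,
    all reading the same pooled encoder representation."""

    def __init__(self, hidden_size=768):  # 768 = BERT-base hidden size
        super().__init__()
        self.heads = nn.ModuleDict({o: nn.Linear(hidden_size, 1) for o in OUTCOMES})

    def forward(self, pooled):
        # pooled: (batch, hidden_size) -> dict of per-outcome probabilities (batch,)
        return {o: torch.sigmoid(h(pooled)).squeeze(-1) for o, h in self.heads.items()}

head = MultiTaskHead()
probs = head(torch.randn(4, 768))  # one probability per outcome per note
```

In multi-task training, the per-outcome losses (e.g. binary cross-entropy) would be summed so the shared encoder learns features useful for all six predictions at once.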

## Dataset

We used 84,875 perioperative clinical notes from patients across the Barnes Jewish Hospital (BJH) system in St. Louis, MO. The data has the following characteristics:

- vocabulary size: 3,203
- average length per clinical note: 8.9 words
- all clinical notes are single sentences

## How to use the model

```python
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("cja5553/BJH-perioperative-notes-bioClinicalBERT")
model = AutoModel.from_pretrained("cja5553/BJH-perioperative-notes-bioClinicalBERT")
```
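The snippet above loads the tokenizer and encoder. To turn a note into a fixed-size vector, one common approach is to mean-pool the encoder's last hidden state over non-padding tokens; this pooling strategy is an assumption, since the card does not state one. Dummy tensors stand in for real model outputs below so the sketch runs without downloading weights.

```python
import torch

def mean_pool(last_hidden_state, attention_mask):
    """Average token embeddings, ignoring padding positions.

    last_hidden_state: (batch, seq_len, hidden)
    attention_mask:    (batch, seq_len), 1 for real tokens, 0 for padding
    """
    mask = attention_mask.unsqueeze(-1).float()
    return (last_hidden_state * mask).sum(dim=1) / mask.sum(dim=1).clamp(min=1e-9)

# Dummy tensors in place of `model(**tokenizer(...)).last_hidden_state`
# (hidden size 768, matching BERT-base encoders like Bio_ClinicalBERT).
hidden = torch.randn(2, 16, 768)
mask = torch.ones(2, 16, dtype=torch.long)
emb = mean_pool(hidden, mask)  # (2, 768): one embedding per note
```

With the real model, you would pass `outputs.last_hidden_state` and the tokenizer's `attention_mask` into `mean_pool` to get one embedding per clinical note.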