Mitch Naylor committed
Commit 9fdc17d
1 Parent(s): ac09500

Update README.md

Files changed (1): README.md (+13 -0)
README.md CHANGED
@@ -1,4 +1,17 @@
  # PsychBERT
  This domain-adapted language model is pretrained from the `bert-base-cased` checkpoint on masked language modeling, using a dataset of ~40,000 PubMed papers in the domains of psychology, psychiatry, mental health, and behavioral health, as well as a dataset of roughly 200,000 social media conversations about mental health. This work is submitted as an entry for BIBM 2021.
 
+ **Note**: the widget to the right does not work with Flax models. To use the model, pull it into a Python session as follows:
+
+ ```python
+ from transformers import FlaxAutoModelForMaskedLM, AutoModelForMaskedLM
+
+ # load as a Flax model
+ flax_lm = FlaxAutoModelForMaskedLM.from_pretrained('mnaylor/psychbert-cased')
+
+ # load as a PyTorch model, converting the Flax weights
+ # (requires Flax to be installed in your environment)
+ pytorch_lm = AutoModelForMaskedLM.from_pretrained('mnaylor/psychbert-cased', from_flax=True)
+ ```
+
  Authors: Vedant Vajre, Mitch Naylor, Uday Kamath, Amarda Shehu
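
For reference, here is a minimal usage sketch building on the snippet above. It is not part of the commit: it assumes the `mnaylor/psychbert-cased` repo also ships a compatible tokenizer, loads the PyTorch-converted weights, and queries them through the `fill-mask` pipeline; the example sentence is illustrative only.

```python
from transformers import AutoTokenizer, AutoModelForMaskedLM, pipeline

# load the tokenizer and the weights (converted from Flax) for the PsychBERT checkpoint
tokenizer = AutoTokenizer.from_pretrained('mnaylor/psychbert-cased')
model = AutoModelForMaskedLM.from_pretrained('mnaylor/psychbert-cased', from_flax=True)

# build a fill-mask pipeline from the loaded objects and run a masked-token query
fill_mask = pipeline('fill-mask', model=model, tokenizer=tokenizer)
print(fill_mask("Cognitive behavioral therapy is an effective treatment for [MASK]."))
```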