

mental-longformer-base-4096 is a model continue-pretrained from the longformer-base-4096 checkpoint for the mental healthcare domain. Longformer is a transformer model for long documents: longformer-base-4096 is a BERT-like model initialized from the RoBERTa checkpoint and pretrained with masked language modeling (MLM) on long documents. It supports sequences of up to 4,096 tokens.

Usage

Load the model via the Hugging Face Transformers library:

from transformers import LongformerTokenizer, LongformerModel
tokenizer = LongformerTokenizer.from_pretrained("AIMH/mental-longformer-base-4096")
model = LongformerModel.from_pretrained("AIMH/mental-longformer-base-4096")
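Once loaded, the model returns per-token hidden states; a common way to get a single document embedding is attention-mask-weighted mean pooling. The sketch below is an assumption about downstream usage, not part of the official card; note that Longformer also accepts a `global_attention_mask`, and global attention on the first token is commonly recommended.

```python
import torch

def mean_pool(last_hidden_state, attention_mask):
    """Average token embeddings, ignoring padding positions."""
    mask = attention_mask.unsqueeze(-1).float()
    summed = (last_hidden_state * mask).sum(dim=1)
    counts = mask.sum(dim=1).clamp(min=1e-9)  # avoid division by zero
    return summed / counts

# Usage with the tokenizer/model loaded above (requires access to the gated repo):
# inputs = tokenizer("I have been feeling anxious lately.", return_tensors="pt")
# inputs["global_attention_mask"] = torch.zeros_like(inputs["input_ids"])
# inputs["global_attention_mask"][:, 0] = 1  # global attention on the first token
# with torch.no_grad():
#     out = model(**inputs)
# embedding = mean_pool(out.last_hidden_state, inputs["attention_mask"])
```

The pooling function is model-agnostic, so it can be checked on dummy tensors without downloading the gated weights.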

To minimize the influence of worrying mask predictions, this model is gated. To download a gated model, you need to be authenticated with a Hugging Face account; see the Hugging Face documentation on gated models for details.
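One way to authenticate before downloading, assuming a standard Hugging Face setup (token names and URL below are illustrative):

```shell
# Log in once; the token is cached for later from_pretrained() calls.
# Create a read token at https://huggingface.co/settings/tokens
pip install -U huggingface_hub
huggingface-cli login
```

Alternatively, recent versions of Transformers accept a `token` argument to `from_pretrained` if you prefer not to cache credentials globally.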

Paper

@article{ji-domain-specific,
  author        = {Shaoxiong Ji and Tianlin Zhang and Kailai Yang and Sophia Ananiadou and Erik Cambria and J{\"o}rg Tiedemann},
  journal       = {arXiv preprint arXiv:2304.10447},
  title         = {Domain-specific Continued Pretraining of Language Models for Capturing Long Context in Mental Health},
  year          = {2023},
  url           = {https://arxiv.org/abs/2304.10447}
}

Disclaimer

The model predictions are not psychiatric diagnoses. We recommend that anyone experiencing mental health issues contact their local mental health helpline and seek professional help if possible.
