---
license: bsd
tags:
  - chemistry
  - biology
  - protein
  - antibodies
  - antibody
  - light chain
  - AbLang
  - CDR
  - OAS
---

# AbLang model for light chains

This is a Hugging Face version of AbLang: a language model for antibodies. It was introduced in [this paper](https://doi.org/10.1101/2022.01.20.477061) and first released in [this repository](https://github.com/oxpig/AbLang). The model was trained on uppercase amino acids and only accepts capital-letter amino acid codes.

## Intended uses & limitations

The model can be used for protein feature extraction or fine-tuned on downstream tasks (TBA).
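
As a hedged illustration of the fine-tuning route (not part of the released model), a small task head could be attached on top of pooled AbLang features. The task, `num_labels`, head architecture, and the shortened example sequence are all illustrative assumptions.

```python
import torch
from transformers import AutoModel, AutoTokenizer

class AbLangLightClassifier(torch.nn.Module):
    """Minimal sketch of a hypothetical downstream classifier on top of AbLang_light."""

    def __init__(self, num_labels=2):  # num_labels is an illustrative assumption
        super().__init__()
        self.backbone = AutoModel.from_pretrained('qilowoq/AbLang_light', trust_remote_code=True)
        # LazyLinear infers its input dimension on the first forward pass,
        # so the backbone's hidden size is not hard-coded here.
        self.head = torch.nn.LazyLinear(num_labels)

    def forward(self, **encoded_input):
        output = self.backbone(**encoded_input)
        pooled = output.last_hidden_state[:, 0, :]  # first-token embedding, as in the usage example below
        return self.head(pooled)

tokenizer = AutoTokenizer.from_pretrained('qilowoq/AbLang_light')
model = AbLangLightClassifier()
# Shortened example light-chain fragment; residues must be uppercase and space-separated.
batch = tokenizer(' '.join("DIQMTQSPSTLSASIGDTVRISCRASQSITGNW"), return_tensors='pt')
logits = model(**batch)  # shape: (1, num_labels)
```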

## How to use

Here is how to use this model to get the features of a given antibody sequence in PyTorch:

```python
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained('qilowoq/AbLang_light')
model = AutoModel.from_pretrained('qilowoq/AbLang_light', trust_remote_code=True)

# Residues must be uppercase and separated by spaces for the tokenizer.
sequence_Example = ' '.join("DIQMTQSPSTLSASIGDTVRISCRASQSITGNWVAWYQQRPGKAPRLLIYRGAALLGGVPSRFSGSAAGTDFTLTIGNLQAEDFGTFYCQQYDTYPGTFGQGTKVEVKRTVAAPSVFIFPPSDEQLKSGTASVVCLLNNFYPREAKVQWKVDNALQSGNSQESVTEQDSKDSTYSLSSTLTLSKADYEKHKVYACEVTHQGLSSPVTKSFNR")
encoded_input = tokenizer(sequence_Example, return_tensors='pt')
model_output = model(**encoded_input)
```

Sequence embeddings can be produced as follows:

```python
seq_embs = model_output.last_hidden_state[:, 0, :]  # sequence-level embedding from the first token's hidden state
```
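
A common alternative (not prescribed by the AbLang authors) is masked mean pooling over the residue positions instead of taking only the first token. A minimal sketch, reusing `encoded_input` and `model_output` from above:

```python
import torch

# Average residue representations, ignoring padding via the attention mask.
# The choice of mean pooling here is an assumption, not the model's own pooling.
mask = encoded_input['attention_mask'].unsqueeze(-1).float()
summed = (model_output.last_hidden_state * mask).sum(dim=1)
counts = mask.sum(dim=1).clamp(min=1e-9)
mean_pooled_embs = summed / counts  # shape: (batch_size, hidden_size)
```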

## Citation

```bibtex
@article{Olsen2022,
  title={AbLang: An antibody language model for completing antibody sequences},
  author={Tobias H. Olsen and Iain H. Moal and Charlotte M. Deane},
  journal={bioRxiv},
  doi={https://doi.org/10.1101/2022.01.20.477061},
  year={2022}
}
```