---
license: mit
---
|
We provide both a Hugging Face version and an [esm version](https://github.com/facebookresearch/esm) of SaProt (see our GitHub repository: <https://github.com/SaProt/SaProt>). Users can choose either one.
|
|
|
### Hugging Face model

The following code shows how to load the model:
|
```python
from transformers import EsmTokenizer, EsmForMaskedLM

model_path = "/your/path/to/SaProt_650M_AF2"
tokenizer = EsmTokenizer.from_pretrained(model_path)
model = EsmForMaskedLM.from_pretrained(model_path)

#################### Example ####################
device = "cuda"
model.to(device)

# A structure-aware sequence: each token combines an amino acid (uppercase)
# with a Foldseek 3Di structure letter (lowercase).
seq = "MdEvVpQpLrVyQdYaKv"
tokens = tokenizer.tokenize(seq)
print(tokens)

inputs = tokenizer(seq, return_tensors="pt")
inputs = {k: v.to(device) for k, v in inputs.items()}

# Logits over the structure-aware vocabulary at each position
# (9 tokens plus BOS and EOS give a length of 11).
outputs = model(**inputs)
print(outputs.logits.shape)

"""
['Md', 'Ev', 'Vp', 'Qp', 'Lr', 'Vy', 'Qd', 'Ya', 'Kv']
torch.Size([1, 11, 446])
"""
```
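
The masked-LM head is only one way to use the model. As a minimal sketch, per-residue embeddings can also be read from the hidden states via the standard `transformers` `output_hidden_states` flag, reusing `model` and `inputs` from above; the pooling step is our own illustrative choice, not part of the official example.

```python
import torch

# Sketch: per-residue embeddings from the last hidden layer.
# `output_hidden_states` is a standard transformers flag; the
# mean-pooling below is an illustrative choice.
with torch.no_grad():
    outputs = model(**inputs, output_hidden_states=True)

residue_embeddings = outputs.hidden_states[-1]  # (batch, seq_len, hidden_size)

# Drop BOS/EOS, then mean-pool for a single per-protein vector.
protein_embedding = residue_embeddings[0, 1:-1].mean(dim=0)
print(protein_embedding.shape)
```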
|
|
|
### esm model

The esm version is also stored in the same folder, named `SaProt_650M_AF2.pt`. We provide a function to load the model:
|
```python
# `utils.esm_loader` ships with our GitHub repository.
from utils.esm_loader import load_esm_saprot

model_path = "/your/path/to/SaProt_650M_AF2.pt"
model, alphabet = load_esm_saprot(model_path)
```
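
Once loaded, a usage sketch might look like the following, assuming the returned `model` and `alphabet` follow the standard esm interface (batch converter, `repr_layers`); the layer index 33 assumes the 33-layer 650M backbone.

```python
import torch

# Sketch assuming the standard esm interface; structure-aware SaProt
# tokens replace the plain amino-acid strings normally fed to esm.
data = [("protein1", "MdEvVpQpLrVyQdYaKv")]
batch_converter = alphabet.get_batch_converter()
labels, strs, tokens = batch_converter(data)

with torch.no_grad():
    results = model(tokens, repr_layers=[33])

# Per-residue representations from layer 33 (final layer of the 650M model).
representations = results["representations"][33]
print(representations.shape)
```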