
This is the unidirectional model trained on 7 protein families:

  • PF00002 - GPCRs
  • PF00042 - Globins
  • PF00125 - Core histones
  • PF00127 - Copper binding proteins
  • PF00257 - Dehydrins
  • PF00262 - Calreticulins
  • PF03668 - P-loop ATPase

Check out the GitHub repo for more information.

Example usage:

from transformers import AutoModelForCausalLM
from tokenizers import Tokenizer
# optionally use local imports
# from models.progen.modeling_progen import ProGenForCausalLM
# from models.progen.configuration_progen import ProGenConfig
import torch
import torch.nn.functional as F

# load model and tokenizer
model = AutoModelForCausalLM.from_pretrained("hugohrban/progen2-small-mix7", trust_remote_code=True)
tokenizer = Tokenizer.from_pretrained("hugohrban/progen2-small-mix7")
tokenizer.no_padding()  # disable padding; we encode a single prompt only

# prepare input
prompt = "<|pf03668|>1MEVVIVTGMSGAGK"
input_ids = torch.tensor(tokenizer.encode(prompt).ids).to(model.device)

# forward pass (no gradients needed for inference)
with torch.no_grad():
    logits = model(input_ids).logits

# print the probability of each token being the next one after the prompt
next_token_logits = logits[-1, :]
next_token_probs = F.softmax(next_token_logits, dim=-1)
for i in range(tokenizer.get_vocab_size(with_added_tokens=False)):
    print(f"{tokenizer.id_to_token(i)}: {100 * next_token_probs[i].item():.2f} %")