---
license: mit
language: protein
tags:
- protein language model
datasets:
- Uniref50
---
# DistilProtBert model
Distilled protein language model of [ProtBert](https://huggingface.co/Rostlab/prot_bert).
In addition to the cross entropy and cosine teacher-student distillation losses, DistilProtBert was pretrained on a masked language modeling (MLM) objective. It works only with capital-letter amino acid sequences.
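Since the model accepts only capital-letter amino acids, inputs should be uppercased and, as with ProtBert-style tokenizers, space-separated per residue. A minimal preprocessing sketch (the checkpoint placeholder below is an assumption, not the actual model path):

```python
def preprocess(seq: str) -> str:
    """Uppercase a protein sequence and insert spaces between residues,
    the input format expected by ProtBert-style tokenizers."""
    return " ".join(seq.upper())

print(preprocess("mktay"))  # -> "M K T A Y"

# Typical usage with the transformers library (checkpoint name is a placeholder):
# from transformers import pipeline
# unmasker = pipeline("fill-mask", model="<distilprotbert-checkpoint>")
# unmasker(preprocess("mktay").replace("T", "[MASK]"))
```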
# Model description