mjbuehler committed on
Commit c41ee8a
1 Parent(s): 573284b

Update README.md

Files changed (1)
  1. README.md +1 -1
README.md CHANGED
@@ -8,7 +8,7 @@ license: apache-2.0
 
 This protein language model is an autoregressive transformer model in GPT-style, trained to analyze and predict the mechanical properties of a large number of protein sequences.
 
-This protein language foundation model was based on the NeoGPT-X architecture and uses rotary positional embeddings (RoPE). It has 16 attention heads, 36 hidden layers and a hidden size of 1024, an intermediate size of 4086 and uses a GeLU activation function.
+This protein language foundation model was based on the NeoGPT-X architecture and uses rotary positional embeddings (RoPE). It has 16 attention heads, 36 hidden layers and a hidden size of 1024, an intermediate size of 4096 and uses a GeLU activation function.
 
 The pretraining task is defined as "Sequence<...>" where ... is an amino acid sequence.
 
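For reference, a minimal sketch of what the architecture description in this README corresponds to in code, assuming "NeoGPT-X" refers to the GPT-NeoX implementation in Hugging Face transformers (which provides RoPE natively). The vocabulary size, context length, and example sequence are illustrative placeholders, not values stated in the commit:

```python
# A minimal sketch, not the authors' training code: it assumes the GPT-NeoX
# implementation in transformers matches the "NeoGPT-X" architecture named above.
from transformers import GPTNeoXConfig, GPTNeoXForCausalLM

config = GPTNeoXConfig(
    num_attention_heads=16,        # 16 attention heads
    num_hidden_layers=36,          # 36 hidden layers
    hidden_size=1024,              # hidden size of 1024
    intermediate_size=4096,        # intermediate size of 4096 (the value this commit corrects)
    hidden_act="gelu",             # GeLU activation function
    vocab_size=128,                # placeholder: the actual tokenizer vocabulary is not given here
    max_position_embeddings=2048,  # placeholder context length
)
model = GPTNeoXForCausalLM(config)  # rotary positional embeddings (RoPE) are built into GPT-NeoX

# Pretraining examples follow the "Sequence<...>" format described above; the
# amino acid sequence below is an arbitrary illustration, not data from the model card.
prompt = "Sequence<GIVEQCCTSICSLYQLENYCN>"
```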