damlab committed
Commit
a2dfe01
1 Parent(s): 865e25d

Update README.md

Files changed (1)
  1. README.md +5 -6
README.md CHANGED
@@ -8,10 +8,9 @@ widget:
 
 ---
 
-# Model Card for [HIV_V3_coreceptor]
+# HIV_V3_coreceptor model
 
 ## Table of Contents
-- [Table of Contents](#table-of-contents)
 - [Summary](#model-summary)
 - [Model Description](#model-description)
 - [Intended Uses & Limitations](#intended-uses-&-limitations)
@@ -25,7 +24,7 @@ widget:
 
 ## Summary
 
-The HIV-BERT-Coreceptor model was trained as a refinement of the HIV-BERT model (insert link) and serves to better predict HIV V3 coreceptor tropism. HIV-BERT is a model refined from the [ProtBert-BFD model](https://huggingface.co/Rostlab/prot_bert_bfd) to better fulfill HIV-centric tasks. This model was then trained using HIV V3 sequences from the [Los Alamos HIV Sequence Database](https://www.hiv.lanl.gov/content/sequence/HIV/mainpage.html), allowing even more precise prediction of V3 coreceptor tropism than the HIV-BERT model can provide.
+The HIV-BERT-Coreceptor model was trained as a refinement of the [HIV-BERT model](https://huggingface.co/damlab/HIV_BERT) and serves to better predict HIV V3 coreceptor tropism. HIV-BERT is a model refined from the [ProtBert-BFD model](https://huggingface.co/Rostlab/prot_bert_bfd) to better fulfill HIV-centric tasks. This model was then trained using HIV V3 sequences from the [Los Alamos HIV Sequence Database](https://www.hiv.lanl.gov/content/sequence/HIV/mainpage.html), allowing even more precise prediction of V3 coreceptor tropism than the HIV-BERT model can provide.
 
 ## Model Description
 
@@ -43,17 +42,17 @@ This tool was trained using the [Los Alamos HIV sequence dataset](https://www.hi
 
 ## Training Data
 
-This model was trained on the 0th fold of the damlab/HIV_V3_coreceptor dataset. The dataset consists of 2935 V3 sequences (approximately 35 tokens each) extracted from the Los Alamos HIV Sequence database.
+This model was trained on the 0th fold of the [damlab/HIV_V3_coreceptor dataset](https://huggingface.co/datasets/damlab/HIV_V3_coreceptor). The dataset consists of 2935 V3 sequences (approximately 35 tokens each) extracted from the [Los Alamos HIV Sequence database](https://www.hiv.lanl.gov/content/sequence/HIV/mainpage.html).
 
 ## Training Procedure
 
 ### Preprocessing
 
-As with the rostlab/Prot-bert-bfd model, the rare amino acids U, Z, O, and B were converted to X, and spaces were added between each amino acid. All strings were concatenated and chunked into 256-token chunks for training. A random 20% of chunks were held out for validation.
+As with the [rostlab/Prot-bert-bfd model](https://huggingface.co/Rostlab/prot_bert_bfd), the rare amino acids U, Z, O, and B were converted to X, and spaces were added between each amino acid. All strings were concatenated and chunked into 256-token chunks for training. A random 20% of chunks were held out for validation.
 
 ### Training
 
-The damlab/HIV-BERT model was used as the initial weights for an AutoModelForSequenceClassification model. The model was trained with a learning rate of 1E-5, 50K warm-up steps, and a cosine_with_restarts learning-rate schedule; training continued until 3 consecutive epochs failed to improve the loss on the held-out dataset. As this is a multi-label classification task (a protein can bind CCR5, CXCR4, neither, or both), the loss was calculated as the binary cross-entropy (BCE) for each category. The BCE for each class was weighted by the inverse of its class frequency to compensate for the class imbalance.
+The [damlab/HIV-BERT model](https://huggingface.co/damlab/HIV_BERT) was used as the initial weights for an AutoModelForSequenceClassification model. The model was trained with a learning rate of 1E-5, 50K warm-up steps, and a cosine_with_restarts learning-rate schedule; training continued until 3 consecutive epochs failed to improve the loss on the held-out dataset. As this is a multi-label classification task (a protein can bind CCR5, CXCR4, neither, or both), the loss was calculated as the binary cross-entropy (BCE) for each category. The BCE for each class was weighted by the inverse of its class frequency to compensate for the class imbalance.
 
 ## Evaluation Results
 
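
The Summary describes predicting V3 coreceptor tropism directly from a sequence. Below is a minimal inference sketch, assuming the model is published as `damlab/HIV_V3_coreceptor` and follows the standard `AutoModelForSequenceClassification` interface (confirm the label order against `model.config.id2label`); the input sequence is illustrative only.

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

MODEL_ID = "damlab/HIV_V3_coreceptor"  # assumed repo id for this card

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForSequenceClassification.from_pretrained(MODEL_ID)
model.eval()

# V3 sequences are supplied as space-separated amino acids (see Preprocessing).
v3 = "C T R P N N N T R K S I H I G P G R A F Y T T G E I I G D I R Q A H C"  # illustrative sequence

with torch.no_grad():
    logits = model(**tokenizer(v3, return_tensors="pt")).logits

# Multi-label task (CCR5 / CXCR4): apply a sigmoid per class rather than a softmax.
probs = torch.sigmoid(logits).squeeze()
for i, p in enumerate(probs.tolist()):
    print(model.config.id2label[i], round(p, 3))
```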
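The Preprocessing step (mapping the rare amino acids U, Z, O, and B to X and space-separating the residues) can be expressed as a small helper; this is a sketch, and the function name is illustrative.

```python
import re

def preprocess_v3(seq: str) -> str:
    """Map the rare amino acids U, Z, O and B to X and space-separate the
    residues, mirroring the Preprocessing section of the card."""
    return " ".join(re.sub(r"[UZOB]", "X", seq.upper()))

print(preprocess_v3("CTRPNNNTRKSIB"))  # -> "C T R P N N N T R K S I X"
```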
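The class-weighted binary cross-entropy described in the Training section can be sketched with `torch.nn.BCEWithLogitsLoss`. The exact weighting used for this model is not spelled out, so the `(1 - p) / p` positive weight below is one plausible reading of "weighted by the inverse of the class ratio", and the class fractions shown are illustrative.

```python
import torch

def weighted_bce(pos_fraction):
    """Binary cross-entropy with per-class positive weights.

    pos_fraction[i] is the fraction of training sequences positive for class i
    (e.g. CCR5, CXCR4); pos_weight = (1 - p) / p up-weights rarer positives."""
    p = torch.as_tensor(pos_fraction, dtype=torch.float)
    return torch.nn.BCEWithLogitsLoss(pos_weight=(1.0 - p) / p)

# Illustrative class balance, not the dataset's actual ratios.
loss_fn = weighted_bce([0.80, 0.20])
logits = torch.randn(4, 2)                       # 4 sequences, 2 binding classes
labels = torch.tensor([[1., 0.], [1., 1.], [0., 0.], [1., 0.]])
print(loss_fn(logits, labels).item())
```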