Pretergeek committed on
Commit
1cbe4e7
1 Parent(s): 4cce2a2

Update README.md

Files changed (1): README.md (+1, -1)
README.md CHANGED
@@ -120,7 +120,7 @@ The authors of the paper added new layers interleaved in between the original layers
 
 I used the same method but added the new layers to the end of the model. My rationale is that the level of abstraction increases with each layer of the model. So, while new layers spread along the original layers will help the model to learn new tasks, adding layers to the end of the model and then re-training/fine-tuning it on tasks it already performs well could improve the model's understanding of those tasks and allow it to perform them better by employing more complex reasoning.
 
- This model has not yet received additional training, so it should perform close to the original model. Evaluations are pending and will be added when available.
+ This model has not yet received additional training, so it should perform close to the original model.
 
 ### Models Merged
 
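
For context, appending duplicated layers to the end of a model is typically done with mergekit's passthrough merge method. Below is a minimal sketch assuming a hypothetical 32-layer base model; the model name and layer ranges are placeholders, since the actual configuration is not shown in this diff:

```yaml
# Hypothetical mergekit passthrough config: keep all 32 original layers,
# then append a copy of the last 8 layers to the end of the stack.
slices:
  - sources:
      - model: example-org/base-model  # placeholder, not the actual base model
        layer_range: [0, 32]           # the full original layer stack
  - sources:
      - model: example-org/base-model
        layer_range: [24, 32]          # duplicated and appended at the end
merge_method: passthrough
dtype: bfloat16
```

The appended layers start out as exact copies of existing ones, which is consistent with the note above that, before any additional training, the merged model should perform close to the original.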