jtatman committed
Commit 9676f2e
1 Parent(s): f426d71

Update README.md

Files changed (1)
  1. README.md +3 -1
README.md CHANGED
@@ -2,6 +2,8 @@
 library_name: transformers
 tags:
 - experimental
+- mergekit
+- model from scratch
 license: apache-2.0
 ---
 
@@ -23,4 +25,4 @@ The model will be used for layer analysis and trained on a close approximation o
 
 This process will be ongoing to see if rank stabilized tuning can save and enhance the original model information through recognizing original weight associations in the preserved layers, even after model resizing.
 
-There is a twin project with a more significant size reduction (600 million params) that is being used for training analysis here: [jtatman/sciphi-mini-600m](https://huggingface.co/jtatman/sciphi-mini-600m)
+There is a twin (parent) project with a less significant size reduction (600 million params) that is being used for training analysis here: [jtatman/sciphi-mini-600m](https://huggingface.co/jtatman/sciphi-mini-600m)