Fill-Mask
Transformers
PyTorch
English
roberta
earth science
climate
biology
Inference Endpoints
Muthukumaran committed on
Commit 857e8f7
1 Parent(s): 2609f8e

Update README.md

Files changed (1)
  1. README.md +1 -0
README.md CHANGED
@@ -19,6 +19,7 @@ nasa-smd-ibm-v0.1 is a RoBERTa-based, Encoder-only transformer model, domain-ada
 - **Tokenizer**: Custom
 - **Parameters**: 125M
 - **Pretraining Strategy**: Masked Language Modeling (MLM)
+- **Distilled Version**: You can download a distilled version of the model (30 Million Parameters) here: https://drive.google.com/file/d/19s2Vv9WlmlRhh_AhzdP-s__0spQCG8cQ/view?usp=sharing
 
 ## Training Data
 - Wikipedia English (Feb 1, 2020)
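
Since the model card describes a RoBERTa-based, encoder-only model pretrained with masked language modeling, a fill-mask query is the natural way to exercise it. The sketch below is a minimal, assumed usage example with the `transformers` fill-mask pipeline; the checkpoint id `nasa-impact/nasa-smd-ibm-v0.1` and the example sentence are illustrative assumptions, not taken from this commit.

```python
# Minimal sketch: fill-mask inference with the transformers pipeline.
# The checkpoint id below is an assumption; replace it with the actual
# Hub path or a local directory containing the model and its custom tokenizer.
from transformers import pipeline

fill_mask = pipeline(
    "fill-mask",
    model="nasa-impact/nasa-smd-ibm-v0.1",  # assumed checkpoint id
)

# RoBERTa-style models use "<mask>" as the mask token.
predictions = fill_mask(
    "The primary driver of recent climate <mask> is greenhouse gas emissions."
)
for p in predictions:
    print(p["token_str"], round(p["score"], 4))
```

The same call should work against the distilled 30M-parameter checkpoint once it is downloaded and passed as a local path to `model=`.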