Fill-Mask
Transformers
PyTorch
English
roberta
earth science
climate
biology
Inference Endpoints
Muthukumaran committed on
Commit c2ee899
1 Parent(s): ee299e3

Update README.md

Files changed (1)
  1. README.md +1 -1
README.md CHANGED
@@ -23,7 +23,7 @@ nasa-smd-ibm-v0.1 (Currently named as Indus) is a RoBERTa-based, Encoder-only tr
  - **Tokenizer**: Custom
  - **Parameters**: 125M
  - **Pretraining Strategy**: Masked Language Modeling (MLM)
- - **Distilled Version**: You can download a distilled version of the model (30 Million Parameters) here: https://drive.google.com/file/d/19s2Vv9WlmlRhh_AhzdP-s__0spQCG8cQ/view?usp=sharing
+ - **Distilled Version**: You can download a distilled version of the model (30 Million Parameters) here: https://huggingface.co/nasa-impact/nasa-smd-ibm-distil-v0.1
  
  ## Training Data
  - Wikipedia English (Feb 1, 2020)
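
Since the card describes a RoBERTa-based, encoder-only model pretrained with masked language modeling, a fill-mask call is the natural way to try it. The sketch below is a minimal example, not an official usage snippet: it assumes the base checkpoint is published on the Hugging Face Hub under an ID such as `nasa-impact/nasa-smd-ibm-v0.1` (only the distilled checkpoint's URL appears in the diff above) and that the custom tokenizer loads through the standard `transformers` pipeline.

```python
# Minimal fill-mask sketch; the model ID nasa-impact/nasa-smd-ibm-v0.1 is an assumption.
from transformers import pipeline

# The pipeline loads the model and its custom tokenizer from the Hub.
fill_mask = pipeline("fill-mask", model="nasa-impact/nasa-smd-ibm-v0.1")

# RoBERTa-style checkpoints use "<mask>" as the mask token.
predictions = fill_mask("The atmosphere of Mars is mostly composed of <mask> dioxide.")

# Each prediction is a dict with the filled token and its score.
for pred in predictions:
    print(f"{pred['token_str']!r}: {pred['score']:.3f}")
```

For the 30M-parameter distilled checkpoint linked in the new README line, the same call should work with the model ID `nasa-impact/nasa-smd-ibm-distil-v0.1`.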