dylanAtHum committed on
Commit
9bbc2d0
1 Parent(s): e33bf52

Update Replication Instructions to Use Script to Load Pretrained Model

Files changed (2)
  1. Replication.txt +1 -1
  2. load_mosaic.py +23 -0
Replication.txt CHANGED
@@ -22,7 +22,7 @@ In order to run the training process with our specific model, we need to make a
 
  To alter the sentence-transformers library, clone the repository from https://github.com/UKPLab/sentence-transformers locally and replace the SentenceTransformer.py and Transformer.py files located within the sentence-transformers/sentence_transformers/ and sentence-transformers/sentence_transformers/models/ directories of the cloned repository, respectively, with those located inside the dev/ folder. (This has already been done in this notebook instance, but this will have to be completed if training on another system.)
 
- Before conducting actual training, we also need to clone the mosaic-bert-base-seqlen-2048 model locally and make a few small changes to its config.json file. Running Mosaic_Model.ipynb will execute this process and get our model ready to begin training. (Again, this has already been done in this notebook instance, but this will have to be completed if training on another system.)
+ Before conducting actual training, we also need to clone the mosaic-bert-base-seqlen-2048 model locally and make a few small changes to its config.json file. Running load_mosaic.py will execute this process and get the model ready to begin training. (Again, this has already been done in this notebook instance, but this will have to be completed if training on another system.)
 
  Training
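The file-replacement step described in the diff above is mechanical and can be scripted. A minimal Python sketch follows; the helper name `patch_sentence_transformers` and its two path parameters are assumptions for illustration, not part of this repository:

```python
import shutil
from pathlib import Path


def patch_sentence_transformers(dev_dir: str, repo_dir: str) -> None:
    """Copy the customized files from dev/ into a local sentence-transformers clone.

    dev_dir:  folder holding the modified SentenceTransformer.py and Transformer.py
    repo_dir: root of the cloned UKPLab/sentence-transformers repository
    """
    dev = Path(dev_dir)
    repo = Path(repo_dir)
    # Replaces sentence_transformers/SentenceTransformer.py
    shutil.copy(dev / "SentenceTransformer.py",
                repo / "sentence_transformers" / "SentenceTransformer.py")
    # Replaces sentence_transformers/models/Transformer.py
    shutil.copy(dev / "Transformer.py",
                repo / "sentence_transformers" / "models" / "Transformer.py")
```

After running this against a fresh clone, installing the clone in editable mode (e.g. `pip install -e .`) makes the modified library importable for training.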
load_mosaic.py ADDED
@@ -0,0 +1,23 @@
+from huggingface_hub import snapshot_download
+import json
+import os
+
+REPO_ID = "mosaicml/mosaic-bert-base-seqlen-2048"
+MODEL_DIRECTORY = "mosaic-bert-base-seqlen-2048"
+
+
+def main():
+    snapshot_download(repo_id=REPO_ID, local_dir=MODEL_DIRECTORY)
+
+    # modify the model's config.json file to satisfy our requirements
+    config_file_path = os.path.join(MODEL_DIRECTORY, 'config.json')
+    contents = json.load(open(config_file_path))
+    contents['architectures'] = ['BertModel']
+    contents['auto_map']['AutoModel'] = 'bert_layers.BertModel'
+    contents['torch_dtype'] = 'bfloat16'
+    contents['transformers_version'] = '4.28.1'
+    contents['_name_or_path'] = 'mosaic-bert-base-seqlen-2048'
+    json.dump(contents, open(config_file_path, 'w'), ensure_ascii=True)
+
+if __name__ == '__main__':
+    main()
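The config edits above can be exercised without downloading the full model by applying the same key updates to an in-memory dict. The helper below is a hypothetical name for illustration (the sample `auto_map` entry is a stand-in, not the model's actual config):

```python
import json


def patch_config(contents: dict) -> dict:
    """Apply the same config.json edits that load_mosaic.py performs."""
    contents['architectures'] = ['BertModel']
    contents['auto_map']['AutoModel'] = 'bert_layers.BertModel'
    contents['torch_dtype'] = 'bfloat16'
    contents['transformers_version'] = '4.28.1'
    contents['_name_or_path'] = 'mosaic-bert-base-seqlen-2048'
    return contents


# Example: a minimal stand-in for the downloaded config.json
cfg = patch_config({'auto_map': {'AutoConfig': 'configuration_bert.BertConfig'}})
print(json.dumps(cfg, sort_keys=True))
```

Note that the update only adds an `AutoModel` entry to `auto_map`; any entries already present in the downloaded config are left untouched.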