hotchpotch committed on
Commit f5eaf12
1 Parent(s): 0a60ab1

Update README.md

Files changed (1)
  1. README.md +10 -33
README.md CHANGED
@@ -1,46 +1,23 @@
 ---
 tags:
 - generated_from_trainer
-model-index:
-- name: tmp_trainer
-  results: []
+license: mit
 ---
 
-<!-- This model card has been generated automatically according to the information the Trainer had access to. You
-should probably proofread and complete it, then remove this comment. -->
+# mMiniLMv2-L6-H384
 
-# tmp_trainer
+This model is a re-upload of Microsoft's Multilingual [MiniLM v2](https://arxiv.org/abs/2012.15828) to Hugging Face under the MIT License, making it easier to use through the Hugging Face transformers library.
 
-This model was trained from scratch on an unknown dataset.
+The original pre-trained model is provided at the following URL:
 
-## Model description
+- https://1drv.ms/u/s!AjHn0yEmKG8qiyC4-L624EmV2i7z
 
-More information needed
+# License
 
-## Intended uses & limitations
+The license for this model is based on the original license (found in the LICENSE file in the project's root directory), which is the MIT License.
 
-More information needed
+- https://github.com/microsoft/unilm/tree/master/minilm
 
-## Training and evaluation data
+# Attribution
 
-More information needed
-
-## Training procedure
-
-### Training hyperparameters
-
-The following hyperparameters were used during training:
-- learning_rate: 5e-05
-- train_batch_size: 8
-- eval_batch_size: 8
-- seed: 42
-- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
-- lr_scheduler_type: linear
-- num_epochs: 3.0
-
-### Framework versions
-
-- Transformers 4.37.2
-- Pytorch 2.2.1+cu121
-- Datasets 2.17.1
-- Tokenizers 0.15.2
+All credits for this model go to the authors of Multilingual MiniLM v2 and the associated researchers and organizations. When using this model, please be sure to attribute the original authors.
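To complement the README's pointer to the transformers library, here is a minimal usage sketch. The repo id `hotchpotch/mMiniLMv2-L6-H384` is an assumption inferred from the commit author and the model name in the card; substitute the actual Hub id if it differs.

```python
from transformers import AutoModel, AutoTokenizer

# Assumed repo id, inferred from the commit author and the model name;
# adjust if the model lives under a different namespace.
model_id = "hotchpotch/mMiniLMv2-L6-H384"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModel.from_pretrained(model_id)

# Encode a sentence and inspect the encoder output. "H384" in the model
# name indicates a hidden size of 384.
inputs = tokenizer("Hello, world!", return_tensors="pt")
outputs = model(**inputs)
print(outputs.last_hidden_state.shape)  # torch.Size([1, seq_len, 384])
```

As with other MiniLM-style encoders, the raw hidden states are typically pooled (for example, mean pooling) or fine-tuned with a task head rather than used directly.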