bongsoo committed
Commit 2eb8131
1 Parent(s): 3b64fd9

Update README.md

Files changed (1)
  1. README.md +3 -2
README.md CHANGED
@@ -25,14 +25,15 @@ model = AutoModel.from_pretrained('bongsoo/mbertV2.0')
 ## Training
 
 **MLM (Masked Language Model) training**
- - Input model: bert-base-multilingual-cased
+ - Input model: bert-base-multilingual-cased (vocab: 119,548 entries)
 - Corpus: training: bongsoo/moco-corpus-kowiki2022 (7.6M), evaluation: bongsoo/bongevalsmall
 - Hyperparameters: learning rate: 5e-5, epochs: 8, batch size: 32, max_token_len: 128
 - Vocab: 152,537 entries (32,989 new entries added to the original 119,548)
 - Output model: mbertV2.0 (size: 776MB)
 - Training time: 90h on 1 GPU (24GB, 19.6GB used)
- - Loss: training loss: 2.258400, evaluation loss: 3.102096, [perplexity](https://github.com/kobongsoo/BERT/blob/master/bert/bert-perplexity-eval-V1.2.ipynb): 19.78158 (bong_eval: 1,500)
+ - Loss: training loss: 2.258400, evaluation loss: 3.102096, perplexity: 19.78158 ([bongsoo/bongeval](https://huggingface.co/datasets/bongsoo/bongeval): 1,500)
 - For the training code, see [here](https://github.com/kobongsoo/BERT/blob/master/bert/bert-MLM-Trainer-V1.2.ipynb)
+ <br>For the perplexity evaluation code, see [here](https://github.com/kobongsoo/BERT/blob/master/bert/bert-perplexity-eval-V1.2.ipynb)
 
 ## Model Config
 ```
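The bullets in this hunk describe continued MLM pretraining of bert-base-multilingual-cased on bongsoo/moco-corpus-kowiki2022 with an expanded vocabulary. Below is a minimal sketch of such a setup with Hugging Face `transformers`, not the author's notebook (that is linked in the diff); the text column name, the placeholder new-token list, and the 15% masking probability are assumptions.

```python
# Minimal sketch (not the author's code): continued MLM pretraining with the
# hyperparameters listed above (lr 5e-5, 8 epochs, batch size 32, max length 128).
from datasets import load_dataset
from transformers import (AutoModelForMaskedLM, AutoTokenizer,
                          DataCollatorForLanguageModeling, Trainer, TrainingArguments)

tokenizer = AutoTokenizer.from_pretrained("bert-base-multilingual-cased")
model = AutoModelForMaskedLM.from_pretrained("bert-base-multilingual-cased")

# Vocab expansion: add new tokens and resize the embedding matrix so the vocab
# grows from the original 119,548 entries toward 152,537. The token list here
# is a placeholder; the real one has ~32,989 entries.
new_tokens = ["새로운", "어휘", "예시"]
tokenizer.add_tokens(new_tokens)
model.resize_token_embeddings(len(tokenizer))

# Training corpus (assumes a "text" column).
corpus = load_dataset("bongsoo/moco-corpus-kowiki2022", split="train")
tokenized = corpus.map(
    lambda batch: tokenizer(batch["text"], truncation=True, max_length=128),
    batched=True, remove_columns=corpus.column_names)

# Dynamic masking of 15% of tokens for the MLM objective.
collator = DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm_probability=0.15)

args = TrainingArguments(
    output_dir="mbertV2.0",
    learning_rate=5e-5,
    num_train_epochs=8,
    per_device_train_batch_size=32,
    save_strategy="epoch",
)

Trainer(model=model, args=args, train_dataset=tokenized,
        data_collator=collator).train()
```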
 
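The reported perplexity (19.78158 on 1,500 bongsoo/bongeval samples) can be approximated as the exponential of the mean masked-LM loss over the evaluation set. A rough sketch under that assumption follows; the dataset split and text column are guesses, and the author's actual procedure is in the notebook linked in the diff.

```python
# Rough sketch (assumption: perplexity = exp(mean MLM loss) over bongsoo/bongeval).
# The "train" split and "text" column are guesses.
import math

import torch
from datasets import load_dataset
from torch.utils.data import DataLoader
from transformers import (AutoModelForMaskedLM, AutoTokenizer,
                          DataCollatorForLanguageModeling)

tokenizer = AutoTokenizer.from_pretrained("bongsoo/mbertV2.0")
model = AutoModelForMaskedLM.from_pretrained("bongsoo/mbertV2.0").eval()

eval_set = load_dataset("bongsoo/bongeval", split="train")
encoded = eval_set.map(
    lambda batch: tokenizer(batch["text"], truncation=True, max_length=128),
    batched=True, remove_columns=eval_set.column_names)

# The collator randomly masks 15% of tokens and builds the MLM labels.
collator = DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm_probability=0.15)
loader = DataLoader(encoded, batch_size=32, collate_fn=collator)

losses = []
with torch.no_grad():
    for batch in loader:
        losses.append(model(**batch).loss.item())

print("perplexity:", math.exp(sum(losses) / len(losses)))
```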