lots-o committed
Commit c90d80c (1 parent: 771ce19)

Update README.md

Files changed (1)
  1. README.md (+5, -4)
README.md CHANGED
@@ -2,8 +2,6 @@
   license: apache-2.0
   language:
   - ko
 - tags:
 - - arxiv:1909.11942
   ---
 
   # Korean ALBERT
@@ -55,8 +53,11 @@ model = AutoModel.from_pretrained("lots-o/ko-albert-large-v1")
   - The GCP/TPU environment used for training the ALBERT Model was supported by the [TRC](https://sites.research.google/trc/about/) program.
 
   # Reference
 + ## Paper
 + - [ALBERT: A Lite BERT for Self-supervised Learning of Language Representations](https://arxiv.org/abs/1909.11942)
 +
 + ## Github Repos
   - [google-albert](https://github.com/google-research/albert)
   - [albert-zh](https://github.com/brightmart/albert_zh)
   - [KcBERT](https://github.com/Beomi/KcBERT)
 - - [KcBERT-Finetune](https://github.com/Beomi/KcBERT-finetune)
 -
 + - [KcBERT-Finetune](https://github.com/Beomi/KcBERT-finetune)
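
For context, the second hunk's header shows the README's usage snippet loading the model via `AutoModel.from_pretrained("lots-o/ko-albert-large-v1")`. Below is a minimal usage sketch built around that call, assuming the standard transformers API and that a matching tokenizer is published under the same model ID; the example sentence is illustrative only.

```python
# Minimal sketch based on the from_pretrained call visible in the hunk header above.
# Assumption: a tokenizer is also available under "lots-o/ko-albert-large-v1".
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("lots-o/ko-albert-large-v1")
model = AutoModel.from_pretrained("lots-o/ko-albert-large-v1")

# Encode an illustrative Korean sentence and run a forward pass.
inputs = tokenizer("한국어 ALBERT 모델입니다.", return_tensors="pt")
outputs = model(**inputs)
print(outputs.last_hidden_state.shape)  # (batch_size, sequence_length, hidden_size)
```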