wangyulong committed on
Commit 71863d5 • 1 Parent(s): d0f25be

Update README.md

Files changed (1): README.md (+11 -2)
README.md CHANGED
@@ -4,10 +4,12 @@ language:
 license: apache-2.0
 ---
 
-
 # Mengzi-oscar-base (Chinese Multi-modal pre-training model)
 Mengzi-oscar is trained based on the multi-modal pre-training model [Oscar](https://github.com/microsoft/Oscar), and is initialized using [Mengzi-Bert-Base](https://github.com/Langboat/Mengzi). 3.7M pairs of images and texts were used, including 0.7M Chinese image-caption pairs and 3M Chinese image-question pairs, covering a total of 0.22M distinct images.
 
+[Mengzi: Towards Lightweight yet Ingenious Pre-trained Models for Chinese](https://arxiv.org/abs/2110.06696)
+
+
 ## Usage
 #### Installation
 Check [INSTALL.md](https://github.com/microsoft/Oscar/blob/master/INSTALL.md) for installation instructions.
@@ -17,5 +19,12 @@ See the [Mengzi-Oscar.md](https://github.com/Langboat/Mengzi/blob/main/Mengzi-Os
 ## Citation
 If you find the technical report or resource is useful, please cite the following technical report in your paper.
 ```
-example
+@misc{zhang2021mengzi,
+  title={Mengzi: Towards Lightweight yet Ingenious Pre-trained Models for Chinese},
+  author={Zhuosheng Zhang and Hanqing Zhang and Keming Chen and Yuhang Guo and Jingyun Hua and Yulong Wang and Ming Zhou},
+  year={2021},
+  eprint={2110.06696},
+  archivePrefix={arXiv},
+  primaryClass={cs.CL}
+}
 ```
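As a rough sanity check of the released checkpoint, the text side of an Oscar-style model can be inspected with Hugging Face `transformers`, since Oscar builds on a BERT backbone. This is a minimal sketch, not the official usage path (full multi-modal inference requires the Oscar codebase linked in the README above); the hub id `Langboat/mengzi-oscar-base` is an assumption, and loading with plain `BertModel` may warn about image-branch weights it does not recognize.

```python
# Minimal sketch: inspect the text-side (BERT) weights of Mengzi-oscar-base.
# Assumptions: the checkpoint is published under the hub id
# "Langboat/mengzi-oscar-base" and stores BERT-compatible tensors;
# transformers may warn about Oscar-specific (image-branch) parameters
# that it cannot map onto a plain BertModel.
from transformers import BertTokenizer, BertModel

tokenizer = BertTokenizer.from_pretrained("Langboat/mengzi-oscar-base")
model = BertModel.from_pretrained("Langboat/mengzi-oscar-base")

# Encode a short Chinese sentence ("a cat sits on a chair").
inputs = tokenizer("一只猫坐在椅子上", return_tensors="pt")
outputs = model(**inputs)
print(outputs.last_hidden_state.shape)  # e.g. torch.Size([1, seq_len, 768])
```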