nvidia / Retro: InstructRetro 48B
Text Generation · English · Megatron-LM
Commit 66063f6 (1 parent: 9fe3db3), committed by boxin-wbx

Update README.md

Files changed (1): README.md (+3, -1)
README.md CHANGED
@@ -34,6 +34,8 @@ Retro also provides the flexibility to update the
 knowledge stored in LMs [(Wang et al., 2023a)](https://arxiv.org/abs/2304.06762)
 by updating the retrieval database without training LMs again.
 
+## Overview
+
 ### License
 
 The use of this model is governed by the [NVIDIA AI Foundation Models Community License Agreement](https://developer.nvidia.com/downloads/nv-ai-foundation-models-license).
@@ -128,7 +130,7 @@ _Boxin Wang, Wei Ping, Peng Xu, Lawrence McAfee, Zihan Liu, Mohammad Shoeybi, Yi
 
 [InstructRetro: Instruction Tuning post Retrieval-Augmented Pretraining.](https://arxiv.org/abs/2310.07713)
 
-_Boxin Wang, Wei Ping, Lawrence McAfee, Peng Xu, Bo Li, Mohammad Shoeybi, Bryan Catanzaro._
+_Boxin Wang, Wei Ping, Lawrence McAfee, Peng Xu, Bo Li, Mohammad Shoeybi, Bryan Catanzaro._ (ICML 2024)
 
 Please cite the papers as follows if you use the data or code from this repo:
 