keisawada committed on
Commit 066d5fc
1 Parent(s): c763333

Update README.md

Files changed (1): README.md +13 -4
README.md CHANGED
@@ -26,7 +26,7 @@ datasets:
 
 Refer to the [original model](https://huggingface.co/rinna/youri-7b-instruction) for fine-tuning details.
 
-* **Authors**
+* **Contributors**
 
 - [Toshiaki Wakatsuki](https://huggingface.co/t-w)
 - [Tianyu Zhao](https://huggingface.co/tianyuz)
@@ -89,10 +89,19 @@ The model uses the original llama-2 tokenizer.
 
 # How to cite
 ~~~
-@misc{RinnaYouri7bInstructionGPTQ,
-url={https://huggingface.co/rinna/youri-7b-instruction-gptq},
-title={rinna/youri-7b-instruction-gptq},
+@misc{rinna-youri-7b-instruction-gptq,
+title = {rinna/youri-7b-instruction-gptq},
 author={Wakatsuki, Toshiaki and Zhao, Tianyu and Sawada, Kei}
+url = {https://huggingface.co/rinna/youri-7b-instruction-gptq},
+}
+
+@inproceedings{sawada2024release,
+title = {Release of Pre-Trained Models for the {J}apanese Language},
+author = {Sawada, Kei and Zhao, Tianyu and Shing, Makoto and Mitsui, Kentaro and Kaga, Akio and Hono, Yukiya and Wakatsuki, Toshiaki and Mitsuda, Koh},
+booktitle = {Proceedings of the 2024 Joint International Conference on Computational Linguistics, Language Resources and Evaluation (LREC-COLING 2024)},
+month = {5},
+year = {2024},
+url = {https://arxiv.org/abs/2404.01657},
 }
 ~~~
 ---
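
After this commit, the model card's "How to cite" block carries two BibTeX entries. As a minimal sketch of how they would be used in a LaTeX document (the bibliography filename `references.bib` is illustrative, not part of the commit):

```latex
% Sketch: citing both entries from the updated model card.
% Assumes the two BibTeX entries have been saved to references.bib;
% that filename is an assumption for this example.
\documentclass{article}
\begin{document}
We used the quantized instruction-tuned model
\cite{rinna-youri-7b-instruction-gptq}, released as part of rinna's
Japanese pre-trained model series \cite{sawada2024release}.
\bibliographystyle{plain}
\bibliography{references}
\end{document}
```

Citation keys containing hyphens, such as `rinna-youri-7b-instruction-gptq`, are valid in BibTeX, so the renamed key works unchanged with `\cite`.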