Update README.md
README.md
This repo contains the models for "eCeLLM: Generalizing Large Language Models for E-commerce from Large-scale, High-quality Instruction Data".

## eCeLLM Models

Leveraging ECInstruct, we develop eCeLLM by instruction tuning general-purpose LLMs (base models). The eCeLLM-L model is instruction-tuned from the large base model [Llama-2 13B-chat](https://arxiv.org/abs/2307.09288).

## Citation

```bibtex
@misc{peng2024ecellm,
  title={eCeLLM: Generalizing Large Language Models for E-commerce from Large-scale, High-quality Instruction Data},
  author={Bo Peng and Xinyi Ling and Ziru Chen and Huan Sun and Xia Ning},
  year={2024},
  eprint={2402.08831},
  archivePrefix={arXiv},
  primaryClass={cs.CL}
}
```