Update README.md
README.md
CHANGED
@@ -9,4 +9,16 @@ This repo contains the models for "eCeLLM: Generalizing Large Language Models fo
 
 ## eCeLLM Models
 Leveraging ECInstruct, we develop eCeLLM by instruction tuning general-purpose LLMs (base models).
-The eCeLLM-S model is instruction-tuned from the base model [Phi-2](https://www.microsoft.com/en-us/research/blog/phi-2-the-surprising-power-of-small-language-models/).
+The eCeLLM-S model is instruction-tuned from the base model [Phi-2](https://www.microsoft.com/en-us/research/blog/phi-2-the-surprising-power-of-small-language-models/).
+
+## Citation
+```bibtex
+@misc{peng2024ecellm,
+  title={eCeLLM: Generalizing Large Language Models for E-commerce from Large-scale, High-quality Instruction Data},
+  author={Bo Peng and Xinyi Ling and Ziru Chen and Huan Sun and Xia Ning},
+  year={2024},
+  eprint={2402.08831},
+  archivePrefix={arXiv},
+  primaryClass={cs.CL}
+}
+```
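
For context, the snippet below is a minimal sketch of loading one of these instruction-tuned checkpoints with Hugging Face `transformers`. The hub identifier `NingLab/eCeLLM-S` and the example prompt are illustrative assumptions, not part of this change; check the project README or model card for the published names and the expected instruction format.

```python
# Minimal sketch: load an eCeLLM checkpoint with Hugging Face transformers.
# NOTE: the repo id "NingLab/eCeLLM-S" and the prompt below are illustrative
# assumptions; consult the model card for the actual identifiers and format.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "NingLab/eCeLLM-S"  # assumed hub identifier
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # half precision to fit on a single GPU
    device_map="auto",          # place layers on available devices
)

# Hypothetical e-commerce instruction; the real prompt template may differ.
prompt = "Classify the intent of this shopping query: 'wireless earbuds under $50'"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```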