---
license: cc-by-4.0
datasets:
- xin10/ECInstruct
---
# eCeLLM-L
This repo contains the model from the paper "eCeLLM: Generalizing Large Language Models for E-commerce from Large-scale, High-quality Instruction Data".
## eCeLLM Models
Leveraging ECInstruct, we develop eCeLLM by instruction-tuning general-purpose LLMs (base models).
The eCeLLM-L model is instruction-tuned from the large base model [Llama-2 13B-chat](https://huggingface.co/meta-llama/Llama-2-13b-chat-hf).
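
## Usage
Since eCeLLM-L is instruction-tuned from Llama-2 13B-chat, it should load through the standard `transformers` causal-LM interface. The sketch below is illustrative only: the repository id and the example prompt are assumptions, not confirmed by this card, so substitute the actual path of this repo.
```python
# Minimal loading sketch (assumed repo id; replace with this repository's actual path).
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "NingLab/eCeLLM-L"  # assumption: adjust to the real Hugging Face repo id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

# Hypothetical e-commerce instruction; eCeLLM is tuned on ECInstruct-style prompts.
prompt = "Classify the sentiment of the following product review as positive or negative: ..."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```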
## Citation
```bibtex
@misc{peng2024ecellm,
      title={eCeLLM: Generalizing Large Language Models for E-commerce from Large-scale, High-quality Instruction Data},
      author={Bo Peng and Xinyi Ling and Ziru Chen and Huan Sun and Xia Ning},
      year={2024},
      eprint={2402.08831},
      archivePrefix={arXiv},
      primaryClass={cs.CL}
}
```