---
license: cc-by-4.0
datasets:
- NingLab/ECInstruct
---

# eCeLLM-L

This repo contains the model for "eCeLLM: Generalizing Large Language Models for E-commerce from Large-scale, High-quality Instruction Data".

## eCeLLM Models

Leveraging ECInstruct, we develop eCeLLM by instruction tuning general-purpose LLMs (base models). The eCeLLM-L model is instruction-tuned from the large base model [Llama-2 13B-chat](https://huggingface.co/meta-llama/Llama-2-13b-chat-hf).

## Citation

```bibtex
@misc{peng2024ecellm,
      title={eCeLLM: Generalizing Large Language Models for E-commerce from Large-scale, High-quality Instruction Data},
      author={Bo Peng and Xinyi Ling and Ziru Chen and Huan Sun and Xia Ning},
      year={2024},
      eprint={2402.08831},
      archivePrefix={arXiv},
      primaryClass={cs.CL}
}
```
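
## Inference Example

A minimal sketch of how one might load eCeLLM-L for inference with the Hugging Face `transformers` library. It assumes the checkpoint is hosted at `NingLab/eCeLLM-L` (adjust the repo id if it differs) and uses a generic Alpaca-style instruction prompt, which may not match the exact template used in ECInstruct.

```python
# Minimal inference sketch (assumptions: repo id "NingLab/eCeLLM-L",
# generic instruction prompt format; not the paper's official script).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "NingLab/eCeLLM-L"  # assumed repo id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # half precision to fit the 13B model on a single GPU
    device_map="auto",
)

# Illustrative e-commerce instruction (not taken from ECInstruct).
prompt = (
    "Below is an instruction that describes a task. Write a response that "
    "appropriately completes the request.\n\n"
    "### Instruction:\nClassify the sentiment of the following product review "
    "as positive, negative, or neutral.\n\n"
    "### Input:\nThe headphones arrived quickly and sound great for the price.\n\n"
    "### Response:\n"
)

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64, do_sample=False)

# Decode only the newly generated tokens.
print(tokenizer.decode(outputs[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))
```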