---
license: cc-by-4.0
datasets:
- NingLab/ECInstruct
---

# eCeLLM-S

This repo contains the eCeLLM-S model from "eCeLLM: Generalizing Large Language Models for E-commerce from Large-scale, High-quality Instruction Data".

## eCeLLM Models

Leveraging ECInstruct, we develop eCeLLM by instruction-tuning general-purpose LLMs (base models).

The eCeLLM-S model is instruction-tuned from the base model [Phi-2](https://huggingface.co/microsoft/phi-2).
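
As a quick usage sketch (not part of the original card), eCeLLM-S can be loaded like any causal LM with the Hugging Face `transformers` library. The repo id `NingLab/eCeLLM-S` and the instruction-style prompt below are illustrative assumptions; adapt them to the ECInstruct task you are targeting.

```python
# Minimal usage sketch. Assumptions: the model is published as
# "NingLab/eCeLLM-S" and accepts ECInstruct-style instruction prompts.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "NingLab/eCeLLM-S"  # assumed repo id; adjust if it differs
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # half precision to fit a single GPU
    device_map="auto",
)

# Hypothetical product-classification prompt in an instruction format.
prompt = (
    "Instruction: Classify the product below into one of the given categories.\n"
    "Categories: Electronics, Home & Kitchen, Sports & Outdoors\n"
    "Product: Stainless Steel Insulated Water Bottle, 32 oz\n"
    "Answer:"
)
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
with torch.no_grad():
    output_ids = model.generate(**inputs, max_new_tokens=32)

# Decode only the newly generated tokens, skipping the echoed prompt.
new_tokens = output_ids[0][inputs["input_ids"].shape[1]:]
print(tokenizer.decode(new_tokens, skip_special_tokens=True))
```
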
## Citation

```bibtex
@inproceedings{peng2024ecellm,
  title={eCe{LLM}: Generalizing Large Language Models for E-commerce from Large-scale, High-quality Instruction Data},
  author={Bo Peng and Xinyi Ling and Ziru Chen and Huan Sun and Xia Ning},
  booktitle={Forty-first International Conference on Machine Learning},
  year={2024},
  url={https://openreview.net/forum?id=LWRI4uPG2X}
}
```