How to use with the Transformers library
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("text-generation", model="NingLab/eCeLLM-S", trust_remote_code=True)
```
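You can then prompt the pipeline with an e-commerce instruction. A minimal sketch, with an illustrative prompt (see the ECInstruct dataset for the exact instruction formats used during tuning):

```python
# Example prompt; the wording is illustrative, not the official ECInstruct format.
prompt = (
    "Instruction: Classify the sentiment of the following product review "
    "as positive or negative.\n"
    "Review: The headphones arrived quickly and sound great.\n"
    "Answer:"
)
outputs = pipe(prompt, max_new_tokens=32, do_sample=False)
print(outputs[0]["generated_text"])
```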
```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("NingLab/eCeLLM-S", trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained("NingLab/eCeLLM-S", trust_remote_code=True)
```
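With the model loaded directly, generation follows the standard Transformers `generate` API. A minimal sketch, again with an illustrative prompt and decoding settings:

```python
import torch

# Example prompt; the wording is illustrative, not the official ECInstruct format.
prompt = (
    "Instruction: What product category does 'wireless ergonomic mouse' "
    "belong to?\nAnswer:"
)
inputs = tokenizer(prompt, return_tensors="pt")

# Greedy decoding with a small generation budget; adjust as needed.
with torch.no_grad():
    output_ids = model.generate(**inputs, max_new_tokens=32)

print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```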
eCeLLM-S

This repo contains the eCeLLM-S model from the paper "eCeLLM: Generalizing Large Language Models for E-commerce from Large-scale, High-quality Instruction Data".

eCeLLM Models

Leveraging ECInstruct, we develop the eCeLLM models by instruction-tuning general-purpose LLMs (base models). The eCeLLM-S model is instruction-tuned from the small base model Phi-2.

Citation

```bibtex
@inproceedings{peng2024ecellm,
    title={eCe{LLM}: Generalizing Large Language Models for E-commerce from Large-scale, High-quality Instruction Data},
    author={Bo Peng and Xinyi Ling and Ziru Chen and Huan Sun and Xia Ning},
    booktitle={Forty-first International Conference on Machine Learning},
    year={2024},
    url={https://openreview.net/forum?id=LWRI4uPG2X}
}
```