KaizeShi committed on
Commit 10f34ba · verified · 1 Parent(s): 88a4c46

Update README.md

Files changed (1): README.md (+22 −5)

README.md CHANGED
@@ -31,11 +31,28 @@ python inference.py --load_8bit --base_model 'meta-llama/Llama-2-7b-hf' --lora_w
 If you find our work helpful, please consider [citing][paper] the following papers.
 
 ```bibtex
-@article{shi2023llama,
-  title={LLaMA-E: Empowering E-commerce Authoring with Multi-Aspect Instruction Following},
-  author={Shi, Kaize and Sun, Xueyao and Wang, Dingxian and Fu, Yinlin and Xu, Guandong and Li, Qing},
-  journal={arXiv preprint arXiv:2308.04913},
-  year={2023}
+@inproceedings{shi-etal-2025-llama,
+  title = "{LL}a{MA}-{E}: Empowering {E}-commerce Authoring with Object-Interleaved Instruction Following",
+  author = "Shi, Kaize and Sun, Xueyao and Wang, Dingxian and Fu, Yinlin and Xu, Guandong and Li, Qing",
+  editor = "Rambow, Owen and Wanner, Leo and Apidianaki, Marianna and Al-Khalifa, Hend and Eugenio, Barbara Di and Schockaert, Steven",
+  booktitle = "Proceedings of the 31st International Conference on Computational Linguistics",
+  month = jan,
+  year = "2025",
+  address = "Abu Dhabi, UAE",
+  publisher = "Association for Computational Linguistics",
+  url = "https://aclanthology.org/2025.coling-main.58/",
+  pages = "870--885",
+  abstract = "E-commerce authoring entails creating engaging, diverse, and targeted content to enhance preference elicitation and retrieval experience. While Large Language Models (LLMs) have revolutionized content generation, they often fall short in e-commerce applications due to their limited memorization of domain-specific features. This paper proposes LLaMA-E, the unified e-commerce authoring models that address the contextual preferences of customers, sellers, and platforms, the essential objects in e-commerce operation. We design the instruction set derived from tasks of ads generation, query-enhanced product title rewriting, product classification, purchase intent speculation, and general e-commerce Q{\&}A. The instruction formulation ensures the interleaved cover of the presented and required object features, allowing the alignment of base models to parameterize e-commerce knowledge comprehensively. The proposed LLaMA-E models achieve state-of-the-art evaluation performance and exhibit the advantage in zero-shot practical applications. To our knowledge, this is the first LLM tailored to empower authoring applications with comprehensive scenario understanding by integrating features focused on participated objects."
 }
 ```
58