Safetensors · llama

nielsr (HF Staff) committed
Commit 276d113 · verified · 1 Parent(s): 3b2a577

Improve model card: Add pipeline tag, library name, links, and sample usage


This PR significantly enhances the model card for **CASLIE-L** by:

- Adding the `pipeline_tag: image-text-to-text` metadata to accurately reflect its multimodal capabilities (image and text input, text output) and improve discoverability on the Hugging Face Hub.
- Specifying `library_name: transformers` metadata, as evidenced by the `config.json` (`LlamaForCausalLM`, `transformers_version`) and the GitHub requirements, enabling the automated "how to use" widget.
- Including direct links to the paper ([Captions Speak Louder than Images: Generalizing Foundation Models for E-commerce from High-quality Multimodal Instruction Data](https://huggingface.co/papers/2410.17337)), the official [project page](https://ninglab.github.io/CASLIE/), and the [GitHub repository](https://github.com/ninglab/CASLIE).
- Integrating an "Introduction" section from the paper's abstract and GitHub README to provide more context about the model and the MMECInstruct dataset.
- Adding a "Sample Usage" section that directly presents the "Modality-unified Inference" command-line example from the GitHub README, so that no code is invented and users have an accurate starting point for interacting with the model.

These improvements aim to make the model more informative, discoverable, and user-friendly for the community.
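The metadata fields added in this PR (`license`, `library_name`, `pipeline_tag`, and so on) live in the YAML front matter at the top of `README.md`. As an illustrative, stdlib-only sketch (a real model card would be parsed with the `huggingface_hub` library, not by hand), the scalar fields can be read out like this:

```python
# Sketch: extracting scalar front-matter fields of the kind this PR adds.
# The card text below mirrors the new README header; the parsing is a toy
# key/value scan, not a full YAML parser (list-valued fields are skipped).
card = """\
---
base_model:
- meta-llama/Llama-2-13b-chat-hf
datasets:
- NingLab/MMECInstruct
license: cc-by-4.0
library_name: transformers
pipeline_tag: image-text-to-text
---
# CASLIE-L
"""

# The YAML block sits between the first two '---' delimiters.
front_matter = card.split("---")[1]

# Keep only 'key: value' lines; 'base_model:' and '- item' lines lack ': '.
meta = dict(
    line.split(": ", 1)
    for line in front_matter.strip().splitlines()
    if ": " in line
)
print(meta["pipeline_tag"])  # image-text-to-text
```

The `pipeline_tag` value drives the task filter on the Hub, and `library_name` selects which auto-generated "how to use" snippet the widget shows.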

Files changed (1)
  1. README.md +26 -4
README.md CHANGED
@@ -1,18 +1,40 @@
  ---
- license: cc-by-4.0
- datasets:
- - NingLab/MMECInstruct
  base_model:
  - meta-llama/Llama-2-13b-chat-hf
+ datasets:
+ - NingLab/MMECInstruct
+ license: cc-by-4.0
+ library_name: transformers
+ pipeline_tag: image-text-to-text
  ---

  # CASLIE-L

- This repo contains the models for "Captions Speak Louder than Images (CASLIE): Generalizing Foundation Models for E-commerce from High-quality Multimodal Instruction Data"
+ This repository contains the models for "[Captions Speak Louder than Images: Generalizing Foundation Models for E-commerce from High-quality Multimodal Instruction Data](https://huggingface.co/papers/2410.17337)".
+
+ **Project Page**: [https://ninglab.github.io/CASLIE/](https://ninglab.github.io/CASLIE/)
+ **Code Repository**: [https://github.com/ninglab/CASLIE](https://github.com/ninglab/CASLIE)
+
+ ## Introduction
+ Leveraging multimodal data to drive breakthroughs in e-commerce applications through Multimodal Foundation Models (MFMs) is gaining increasing attention. This work introduces [MMECInstruct](https://huggingface.co/datasets/NingLab/MMECInstruct), the first-ever, large-scale, and high-quality multimodal instruction dataset for e-commerce. We also develop CASLIE, a simple, lightweight, yet effective framework for integrating multimodal information for e-commerce. Leveraging MMECInstruct, we fine-tune a series of e-commerce MFMs within CASLIE, denoted as CASLIE models.

  ## CASLIE Models
  The CASLIE-L model is instruction-tuned from the large base model [Llama-2-13b-chat](https://huggingface.co/meta-llama/Llama-2-13b-chat-hf).

+ ## Sample Usage (Modality-unified Inference)
+ To conduct inference with the CASLIE models, refer to the following example directly from the [official GitHub repository](https://github.com/ninglab/CASLIE#modality-unified-inference).
+
+ `$model_path` is the path of the instruction-tuned model.
+
+ `$task` specifies the task to be tested.
+
+ `$output_path` specifies the path where you want to save the inference output.
+
+ Example:
+ ```
+ python inference.py --model_path NingLab/CASLIE-M --task answerability_prediction --output_path ap.json
+ ```
+
  ## Citation
  ```bibtex
  @article{ling2024captions,
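The Sample Usage section added by this PR documents three CLI arguments (`--model_path`, `--task`, `--output_path`). As a small illustrative helper (hypothetical, not part of the CASLIE repository; `inference.py` itself lives only in the GitHub repo), those arguments compose into the documented command line like this:

```python
# Hypothetical helper: assemble the modality-unified inference command from
# its three documented arguments. shlex.join quotes any argument that needs
# shell escaping, so paths with spaces remain safe to paste into a shell.
import shlex

def build_inference_cmd(model_path: str, task: str, output_path: str) -> str:
    """Return the CASLIE inference command line for the given arguments."""
    return shlex.join([
        "python", "inference.py",
        "--model_path", model_path,
        "--task", task,
        "--output_path", output_path,
    ])

cmd = build_inference_cmd("NingLab/CASLIE-M", "answerability_prediction", "ap.json")
print(cmd)
# python inference.py --model_path NingLab/CASLIE-M --task answerability_prediction --output_path ap.json
```

Swapping `NingLab/CASLIE-M` for `NingLab/CASLIE-L` (this repository) and changing `--task` reproduces the other inference configurations described in the GitHub README.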