mickey1976 committed on
Commit cda6e99 · verified · 1 Parent(s): 71f761a

Full upload: dataset with FAISS, NPY, Parquet, JSON, etc.

.gitattributes CHANGED
@@ -62,3 +62,6 @@ meta.json filter=lfs diff=lfs merge=lfs -text
  *.index filter=lfs diff=lfs merge=lfs -text
  items_beauty_concat.faiss filter=lfs diff=lfs merge=lfs -text
  items_beauty_weighted.faiss filter=lfs diff=lfs merge=lfs -text
+ faiss/items_beauty_concat.faiss filter=lfs diff=lfs merge=lfs -text
+ faiss/items_beauty_weighted.faiss filter=lfs diff=lfs merge=lfs -text
+ json/meta.json filter=lfs diff=lfs merge=lfs -text
README.md CHANGED
@@ -1,38 +1,162 @@
- ---
- license: cc-by-nc-4.0
- ---
- Here is a shorter version of the README.md suitable for the Hugging Face dataset card view (top-level summary users see when browsing your dataset):
-
- # 📦 Amazon Beauty Subset for MMR-Agentic-CoVE
-
- This dataset powers the **MMR-Agentic-CoVE** recommender system and contains a compact, multimodal slice of the Amazon Beauty product data. It includes:
-
- - ✅ JSON configs & sequences
- - ✅ NPY embeddings (text, image, meta, CoVE)
- - Parquet structured tables
- - ✅ PEFT model weights (LoRA/adapter)
- - FAISS indexes for fast retrieval
-
- ## 🧭 Folder Structure
-
- json/    → ID maps, defaults, sequences
- npy/     → Embeddings & logits
- parquet/ → Metadata & user-item tables
- model/   → Fine-tuned model weights
- faiss/   → Item FAISS indexes
-
- ## 🔌 Paired Spaces
-
- - **API Backend** → [CoVE API](https://huggingface.co/spaces/mickey1976/cove-api)
- - **Gradio UI** → [CoVE UI](https://huggingface.co/spaces/mickey1976/cove-ui)
-
- ## 📚 Citation
-
- > Ni, J., et al. (2019). *Amazon Review Dataset*. UCSD.
- > https://nijianmo.github.io/amazon/index.html
-
- Maintained by [@mickey1976](https://huggingface.co/mickey1976)
-
+ ---
+ license: cc-by-nc-4.0
+ tags:
+ - recommender
+ - multimodal
+ - amazon
+ - beauty
+ - json
+ - npy
+ - parquet
+ - faiss
+ - lora
+ - huggingface-dataset
+ ---
+
+ # 📦 Amazon Beauty Subset for MMR-Agentic-CoVE
+
+ This dataset contains preprocessed files for the "Beauty" category of the Amazon Reviews dataset.
+ It supports the MMR-Agentic-CoVE recommender system and includes FAISS indexes, LoRA-tuned model weights, and multimodal features.
+
+ Use this dataset with the backend [`cove-api`](https://huggingface.co/spaces/mickey1976/cove-api) and frontend [`cove-ui`](https://huggingface.co/spaces/mickey1976/cove-ui) Spaces for live testing.
+
+ This is a clean, categorized subset of the **Amazon Beauty Products Dataset**, curated for the [MMR-Agentic-CoVE](https://huggingface.co/spaces/mickey1976/cove-ui) recommender system. It includes multimodal item data (text, image, metadata), user interactions, FAISS indexes, model outputs, and embedding vectors, all organized for efficient retrieval by the API and UI Spaces.
+
+ ## 🗂️ Folder Structure
+
+ .
+ ├── json/      # Configs, maps, user/item sequences
+ ├── npy/       # Embedding arrays (text, image, meta, CoVE)
+ ├── parquet/   # Tabular structured data
+ ├── model/     # PEFT/LoRA model weights
+ ├── faiss/     # FAISS index files for nearest-neighbor search
+ └── README.md
+
+ ## 📁 Key Files
+
+ ### `json/`
+ - `defaults.json`: Weight config for fusion modes
+ - `item_ids.json`, `user_seq.json`, `cove_item_ids.json`: ID mappings and test sets
+
+ ### `npy/`
+ - `text.npy`, `image.npy`, `meta.npy`: Item modality embeddings
+ - `cove_logits.npy`, `full_cove_embeddings.npy`: CoVE model outputs
+
+ ### `parquet/`
+ - `reviews.parquet`, `items_catalog.parquet`: Base product metadata
+ - `user_text_emb.parquet`: User text embedding vectors
+
+ ### `model/`
+ - `model.safetensors`, `adapter_model.safetensors`: LoRA fine-tuned weights
+
+ ### `faiss/`
+ - `items_beauty_concat.faiss`, `items_beauty_weighted.faiss`: FAISS indexes for fast item retrieval
+
+ ---
+
+ ## 🔌 Paired Spaces
+
+ - **API Backend (FastAPI):** [CoVE API](https://huggingface.co/spaces/mickey1976/cove-api)
+ - **UI Frontend (Gradio):** [CoVE UI](https://huggingface.co/spaces/mickey1976/cove-ui)
+
+ These Spaces dynamically fetch data from this dataset repo using `huggingface_hub`.
+
+ ---
+
+ ## 🐍 Example: Load Embeddings via `huggingface_hub`
+
+ ```python
+ from huggingface_hub import hf_hub_download
+ import numpy as np
+
+ # Download the text-embedding array from this dataset repo
+ text_emb_path = hf_hub_download(
+     repo_id="mickey1976/mayankc-amazon_beauty_subset",
+     repo_type="dataset",
+     filename="npy/text.npy"
+ )
+
+ text_embeddings = np.load(text_emb_path)
+ ```
+
+ ## 📖 Citation
+
+ Data originally from:
+
+ > Ni, J., et al. (2019). *Amazon Review Dataset*. UCSD.
+ > https://nijianmo.github.io/amazon/index.html
+
+ Used here in support of the MMR-Agentic-CoVE multimodal recommender architecture.
+
+ ## 🛠 Maintained by
+
+ Mayank Choudhary ([@mickey1976](https://huggingface.co/mickey1976) on Hugging Face)
faiss/items_beauty_concat.faiss ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:dddf882840f818cabacf898461f00463b1c13fcc32656a3a98bb75e14c74c20b
+ size 331885
faiss/items_beauty_weighted.faiss ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:2bdaed8eb3e45fc38cb8c79a87df75611fb22f1064bc80f437c7eddc72025bbd
+ size 174125
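The two `.faiss` files above store nearest-neighbor indexes over the fused item embeddings. As a rough sketch of the lookup they accelerate (a brute-force NumPy equivalent with random stand-in vectors; the real files would be loaded via `faiss.read_index` after downloading them with `hf_hub_download`):

```python
import numpy as np

# Random stand-ins for the 85 indexed item vectors (dimension is illustrative)
rng = np.random.default_rng(0)
item_vecs = rng.normal(size=(85, 16)).astype("float32")

# Query with item 3's own vector; an L2 search returns it first at distance 0
query = item_vecs[3:4]
dists = np.linalg.norm(item_vecs - query, axis=1)
top10 = np.argsort(dists)[:10]
```

With the actual index, `faiss.read_index(path)` followed by `index.search(query, 10)` yields the same kind of top-k neighbor list.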
file_manifest.json ADDED
@@ -0,0 +1,26 @@
+ {
+   "faiss": {
+     "concat": "faiss/items_beauty_concat.faiss",
+     "weighted": "faiss/items_beauty_weighted.faiss"
+   },
+   "defaults": "json/defaults.json",
+   "json": [
+     "json/reviews.json",
+     "json/meta.json",
+     "json/leave_one_out_test.json"
+   ],
+   "npy": [
+     "npy/text.npy",
+     "npy/image.npy",
+     "npy/meta.npy",
+     "npy/cove.npy"
+   ],
+   "parquet": [
+     "parquet/items_catalog.parquet",
+     "parquet/reviews.parquet"
+   ],
+   "model": {
+     "adapter": "model/adapter_model.safetensors",
+     "full": "model/model.safetensors"
+   }
+ }
json/config.json ADDED
@@ -0,0 +1,3 @@
+ {
+   "model_type": "gpt2"
+ }
json/datamaps.json ADDED
@@ -0,0 +1,358 @@
+ {
+   "token_prefix": "<|I_",
+   "token_suffix": "|>",
+   "item_to_token": {
+     "B0009RF9DW": "<|I_B0009RF9DW|>",
+     "B000FI4S1E": "<|I_B000FI4S1E|>",
+     "B000URXP6E": "<|I_B000URXP6E|>",
+     "B0012Y0ZG2": "<|I_B0012Y0ZG2|>",
+     "B00W259T7G": "<|I_B00W259T7G|>",
+     "B000VV1YOY": "<|I_B000VV1YOY|>",
+     "B001LNODUS": "<|I_B001LNODUS|>",
+     "B019FWRG3C": "<|I_B019FWRG3C|>",
+     "B00006L9LC": "<|I_B00006L9LC|>",
+     "B001OHV1H4": "<|I_B001OHV1H4|>",
+     "B00VG1AV5Q": "<|I_B00VG1AV5Q|>",
+     "B01DLR9IDI": "<|I_B01DLR9IDI|>",
+     "B00CQ0LN80": "<|I_B00CQ0LN80|>",
+     "B00HLXEXDO": "<|I_B00HLXEXDO|>",
+     "B000X7ST9Y": "<|I_B000X7ST9Y|>",
+     "B001E5PLCM": "<|I_B001E5PLCM|>",
+     "B00DY59MB6": "<|I_B00DY59MB6|>",
+     "B00JF2GVWK": "<|I_B00JF2GVWK|>",
+     "B00L1I1VMG": "<|I_B00L1I1VMG|>",
+     "B0010ZBORW": "<|I_B0010ZBORW|>",
+     "B00QXW95Q4": "<|I_B00QXW95Q4|>",
+     "B006IB5T4W": "<|I_B006IB5T4W|>",
+     "B01E7UKR38": "<|I_B01E7UKR38|>",
+     "B002GP80EU": "<|I_B002GP80EU|>",
+     "B00N2WQ2IW": "<|I_B00N2WQ2IW|>",
+     "B00AKP21KM": "<|I_B00AKP21KM|>",
+     "B00RZYW4RG": "<|I_B00RZYW4RG|>",
+     "B000GLRREU": "<|I_B000GLRREU|>",
+     "B000FOI48G": "<|I_B000FOI48G|>",
+     "B016V8YWBC": "<|I_B016V8YWBC|>",
+     "B0013NB7DW": "<|I_B0013NB7DW|>",
+     "B019809F9Y": "<|I_B019809F9Y|>",
+     "B000NKJIXM": "<|I_B000NKJIXM|>",
+     "B00EYZY6LQ": "<|I_B00EYZY6LQ|>",
+     "B001E96LUO": "<|I_B001E96LUO|>",
+     "B000LIBUBY": "<|I_B000LIBUBY|>",
+     "B000VUXCGI": "<|I_B000VUXCGI|>",
+     "B00IJHY54S": "<|I_B00IJHY54S|>",
+     "B00MTR49IG": "<|I_B00MTR49IG|>",
+     "B00UWB35UY": "<|I_B00UWB35UY|>",
+     "B000PKKAGO": "<|I_B000PKKAGO|>",
+     "B0017TZD7S": "<|I_B0017TZD7S|>",
+     "B008YQM4A6": "<|I_B008YQM4A6|>",
+     "B019LAI4HU": "<|I_B019LAI4HU|>",
+     "B00CZH3K1C": "<|I_B00CZH3K1C|>",
+     "B00VARTPKS": "<|I_B00VARTPKS|>",
+     "B001ET7FZE": "<|I_B001ET7FZE|>",
+     "B01BNEYGQU": "<|I_B01BNEYGQU|>",
+     "B007R6UXNY": "<|I_B007R6UXNY|>",
+     "B004KEJ65C": "<|I_B004KEJ65C|>",
+     "B004CALFE4": "<|I_B004CALFE4|>",
+     "B001F51RAG": "<|I_B001F51RAG|>",
+     "B00NT0AR7E": "<|I_B00NT0AR7E|>",
+     "B00B9V9ASM": "<|I_B00B9V9ASM|>",
+     "B00112DRHY": "<|I_B00112DRHY|>",
+     "B019V2KYZS": "<|I_B019V2KYZS|>",
+     "B0002JHI1I": "<|I_B0002JHI1I|>",
+     "B0006O10P4": "<|I_B0006O10P4|>",
+     "B00120VWTK": "<|I_B00120VWTK|>",
+     "B01DKQAXC0": "<|I_B01DKQAXC0|>",
+     "B000V5Z4J6": "<|I_B000V5Z4J6|>",
+     "B000WYJTZG": "<|I_B000WYJTZG|>",
+     "B00157OBRU": "<|I_B00157OBRU|>",
+     "B0012XPRO8": "<|I_B0012XPRO8|>",
+     "B00GHJOM2U": "<|I_B00GHJOM2U|>",
+     "B002RZZXYE": "<|I_B002RZZXYE|>",
+     "B00MGK9Z8U": "<|I_B00MGK9Z8U|>",
+     "B000W0C07Y": "<|I_B000W0C07Y|>",
+     "B00021DJ32": "<|I_B00021DJ32|>",
+     "B006WYJM8Y": "<|I_B006WYJM8Y|>",
+     "B00EF1QRMU": "<|I_B00EF1QRMU|>",
+     "B00BSE3III": "<|I_B00BSE3III|>",
+     "B0000530HU": "<|I_B0000530HU|>",
+     "B000FTYALG": "<|I_B000FTYALG|>",
+     "B000YB70PS": "<|I_B000YB70PS|>",
+     "B00155Z6V2": "<|I_B00155Z6V2|>",
+     "B0091OCA86": "<|I_B0091OCA86|>",
+     "B00B7V273E": "<|I_B00B7V273E|>",
+     "B000WR2HB6": "<|I_B000WR2HB6|>",
+     "B0014SQQ3M": "<|I_B0014SQQ3M|>",
+     "B0011FYB5I": "<|I_B0011FYB5I|>",
+     "B001QY8QXM": "<|I_B001QY8QXM|>",
+     "B007V6JNE0": "<|I_B007V6JNE0|>",
+     "B000X2FPXC": "<|I_B000X2FPXC|>",
+     "B00126LYJM": "<|I_B00126LYJM|>"
+   },
+   "token_to_item": {
+     "<|I_B0009RF9DW|>": "B0009RF9DW",
+     "<|I_B000FI4S1E|>": "B000FI4S1E",
+     "<|I_B000URXP6E|>": "B000URXP6E",
+     "<|I_B0012Y0ZG2|>": "B0012Y0ZG2",
+     "<|I_B00W259T7G|>": "B00W259T7G",
+     "<|I_B000VV1YOY|>": "B000VV1YOY",
+     "<|I_B001LNODUS|>": "B001LNODUS",
+     "<|I_B019FWRG3C|>": "B019FWRG3C",
+     "<|I_B00006L9LC|>": "B00006L9LC",
+     "<|I_B001OHV1H4|>": "B001OHV1H4",
+     "<|I_B00VG1AV5Q|>": "B00VG1AV5Q",
+     "<|I_B01DLR9IDI|>": "B01DLR9IDI",
+     "<|I_B00CQ0LN80|>": "B00CQ0LN80",
+     "<|I_B00HLXEXDO|>": "B00HLXEXDO",
+     "<|I_B000X7ST9Y|>": "B000X7ST9Y",
+     "<|I_B001E5PLCM|>": "B001E5PLCM",
+     "<|I_B00DY59MB6|>": "B00DY59MB6",
+     "<|I_B00JF2GVWK|>": "B00JF2GVWK",
+     "<|I_B00L1I1VMG|>": "B00L1I1VMG",
+     "<|I_B0010ZBORW|>": "B0010ZBORW",
+     "<|I_B00QXW95Q4|>": "B00QXW95Q4",
+     "<|I_B006IB5T4W|>": "B006IB5T4W",
+     "<|I_B01E7UKR38|>": "B01E7UKR38",
+     "<|I_B002GP80EU|>": "B002GP80EU",
+     "<|I_B00N2WQ2IW|>": "B00N2WQ2IW",
+     "<|I_B00AKP21KM|>": "B00AKP21KM",
+     "<|I_B00RZYW4RG|>": "B00RZYW4RG",
+     "<|I_B000GLRREU|>": "B000GLRREU",
+     "<|I_B000FOI48G|>": "B000FOI48G",
+     "<|I_B016V8YWBC|>": "B016V8YWBC",
+     "<|I_B0013NB7DW|>": "B0013NB7DW",
+     "<|I_B019809F9Y|>": "B019809F9Y",
+     "<|I_B000NKJIXM|>": "B000NKJIXM",
+     "<|I_B00EYZY6LQ|>": "B00EYZY6LQ",
+     "<|I_B001E96LUO|>": "B001E96LUO",
+     "<|I_B000LIBUBY|>": "B000LIBUBY",
+     "<|I_B000VUXCGI|>": "B000VUXCGI",
+     "<|I_B00IJHY54S|>": "B00IJHY54S",
+     "<|I_B00MTR49IG|>": "B00MTR49IG",
+     "<|I_B00UWB35UY|>": "B00UWB35UY",
+     "<|I_B000PKKAGO|>": "B000PKKAGO",
+     "<|I_B0017TZD7S|>": "B0017TZD7S",
+     "<|I_B008YQM4A6|>": "B008YQM4A6",
+     "<|I_B019LAI4HU|>": "B019LAI4HU",
+     "<|I_B00CZH3K1C|>": "B00CZH3K1C",
+     "<|I_B00VARTPKS|>": "B00VARTPKS",
+     "<|I_B001ET7FZE|>": "B001ET7FZE",
+     "<|I_B01BNEYGQU|>": "B01BNEYGQU",
+     "<|I_B007R6UXNY|>": "B007R6UXNY",
+     "<|I_B004KEJ65C|>": "B004KEJ65C",
+     "<|I_B004CALFE4|>": "B004CALFE4",
+     "<|I_B001F51RAG|>": "B001F51RAG",
+     "<|I_B00NT0AR7E|>": "B00NT0AR7E",
+     "<|I_B00B9V9ASM|>": "B00B9V9ASM",
+     "<|I_B00112DRHY|>": "B00112DRHY",
+     "<|I_B019V2KYZS|>": "B019V2KYZS",
+     "<|I_B0002JHI1I|>": "B0002JHI1I",
+     "<|I_B0006O10P4|>": "B0006O10P4",
+     "<|I_B00120VWTK|>": "B00120VWTK",
+     "<|I_B01DKQAXC0|>": "B01DKQAXC0",
+     "<|I_B000V5Z4J6|>": "B000V5Z4J6",
+     "<|I_B000WYJTZG|>": "B000WYJTZG",
+     "<|I_B00157OBRU|>": "B00157OBRU",
+     "<|I_B0012XPRO8|>": "B0012XPRO8",
+     "<|I_B00GHJOM2U|>": "B00GHJOM2U",
+     "<|I_B002RZZXYE|>": "B002RZZXYE",
+     "<|I_B00MGK9Z8U|>": "B00MGK9Z8U",
+     "<|I_B000W0C07Y|>": "B000W0C07Y",
+     "<|I_B00021DJ32|>": "B00021DJ32",
+     "<|I_B006WYJM8Y|>": "B006WYJM8Y",
+     "<|I_B00EF1QRMU|>": "B00EF1QRMU",
+     "<|I_B00BSE3III|>": "B00BSE3III",
+     "<|I_B0000530HU|>": "B0000530HU",
+     "<|I_B000FTYALG|>": "B000FTYALG",
+     "<|I_B000YB70PS|>": "B000YB70PS",
+     "<|I_B00155Z6V2|>": "B00155Z6V2",
+     "<|I_B0091OCA86|>": "B0091OCA86",
+     "<|I_B00B7V273E|>": "B00B7V273E",
+     "<|I_B000WR2HB6|>": "B000WR2HB6",
+     "<|I_B0014SQQ3M|>": "B0014SQQ3M",
+     "<|I_B0011FYB5I|>": "B0011FYB5I",
+     "<|I_B001QY8QXM|>": "B001QY8QXM",
+     "<|I_B007V6JNE0|>": "B007V6JNE0",
+     "<|I_B000X2FPXC|>": "B000X2FPXC",
+     "<|I_B00126LYJM|>": "B00126LYJM"
+   },
+   "item2idx": {
+     "B0009RF9DW": 0,
+     "B000FI4S1E": 1,
+     "B000URXP6E": 2,
+     "B0012Y0ZG2": 3,
+     "B00W259T7G": 4,
+     "B000VV1YOY": 5,
+     "B001LNODUS": 6,
+     "B019FWRG3C": 7,
+     "B00006L9LC": 8,
+     "B001OHV1H4": 9,
+     "B00VG1AV5Q": 10,
+     "B01DLR9IDI": 11,
+     "B00CQ0LN80": 12,
+     "B00HLXEXDO": 13,
+     "B000X7ST9Y": 14,
+     "B001E5PLCM": 15,
+     "B00DY59MB6": 16,
+     "B00JF2GVWK": 17,
+     "B00L1I1VMG": 18,
+     "B0010ZBORW": 19,
+     "B00QXW95Q4": 20,
+     "B006IB5T4W": 21,
+     "B01E7UKR38": 22,
+     "B002GP80EU": 23,
+     "B00N2WQ2IW": 24,
+     "B00AKP21KM": 25,
+     "B00RZYW4RG": 26,
+     "B000GLRREU": 27,
+     "B000FOI48G": 28,
+     "B016V8YWBC": 29,
+     "B0013NB7DW": 30,
+     "B019809F9Y": 31,
+     "B000NKJIXM": 32,
+     "B00EYZY6LQ": 33,
+     "B001E96LUO": 34,
+     "B000LIBUBY": 35,
+     "B000VUXCGI": 36,
+     "B00IJHY54S": 37,
+     "B00MTR49IG": 38,
+     "B00UWB35UY": 39,
+     "B000PKKAGO": 40,
+     "B0017TZD7S": 41,
+     "B008YQM4A6": 42,
+     "B019LAI4HU": 43,
+     "B00CZH3K1C": 44,
+     "B00VARTPKS": 45,
+     "B001ET7FZE": 46,
+     "B01BNEYGQU": 47,
+     "B007R6UXNY": 48,
+     "B004KEJ65C": 49,
+     "B004CALFE4": 50,
+     "B001F51RAG": 51,
+     "B00NT0AR7E": 52,
+     "B00B9V9ASM": 53,
+     "B00112DRHY": 54,
+     "B019V2KYZS": 55,
+     "B0002JHI1I": 56,
+     "B0006O10P4": 57,
+     "B00120VWTK": 58,
+     "B01DKQAXC0": 59,
+     "B000V5Z4J6": 60,
+     "B000WYJTZG": 61,
+     "B00157OBRU": 62,
+     "B0012XPRO8": 63,
+     "B00GHJOM2U": 64,
+     "B002RZZXYE": 65,
+     "B00MGK9Z8U": 66,
+     "B000W0C07Y": 67,
+     "B00021DJ32": 68,
+     "B006WYJM8Y": 69,
+     "B00EF1QRMU": 70,
+     "B00BSE3III": 71,
+     "B0000530HU": 72,
+     "B000FTYALG": 73,
+     "B000YB70PS": 74,
+     "B00155Z6V2": 75,
+     "B0091OCA86": 76,
+     "B00B7V273E": 77,
+     "B000WR2HB6": 78,
+     "B0014SQQ3M": 79,
+     "B0011FYB5I": 80,
+     "B001QY8QXM": 81,
+     "B007V6JNE0": 82,
+     "B000X2FPXC": 83,
+     "B00126LYJM": 84
+   },
+   "idx2item": {
+     "0": "B0009RF9DW",
+     "1": "B000FI4S1E",
+     "2": "B000URXP6E",
+     "3": "B0012Y0ZG2",
+     "4": "B00W259T7G",
+     "5": "B000VV1YOY",
+     "6": "B001LNODUS",
+     "7": "B019FWRG3C",
+     "8": "B00006L9LC",
+     "9": "B001OHV1H4",
+     "10": "B00VG1AV5Q",
+     "11": "B01DLR9IDI",
+     "12": "B00CQ0LN80",
+     "13": "B00HLXEXDO",
+     "14": "B000X7ST9Y",
+     "15": "B001E5PLCM",
+     "16": "B00DY59MB6",
+     "17": "B00JF2GVWK",
+     "18": "B00L1I1VMG",
+     "19": "B0010ZBORW",
+     "20": "B00QXW95Q4",
+     "21": "B006IB5T4W",
+     "22": "B01E7UKR38",
+     "23": "B002GP80EU",
+     "24": "B00N2WQ2IW",
+     "25": "B00AKP21KM",
+     "26": "B00RZYW4RG",
+     "27": "B000GLRREU",
+     "28": "B000FOI48G",
+     "29": "B016V8YWBC",
+     "30": "B0013NB7DW",
+     "31": "B019809F9Y",
+     "32": "B000NKJIXM",
+     "33": "B00EYZY6LQ",
+     "34": "B001E96LUO",
+     "35": "B000LIBUBY",
+     "36": "B000VUXCGI",
+     "37": "B00IJHY54S",
+     "38": "B00MTR49IG",
+     "39": "B00UWB35UY",
+     "40": "B000PKKAGO",
+     "41": "B0017TZD7S",
+     "42": "B008YQM4A6",
+     "43": "B019LAI4HU",
+     "44": "B00CZH3K1C",
+     "45": "B00VARTPKS",
+     "46": "B001ET7FZE",
+     "47": "B01BNEYGQU",
+     "48": "B007R6UXNY",
+     "49": "B004KEJ65C",
+     "50": "B004CALFE4",
+     "51": "B001F51RAG",
+     "52": "B00NT0AR7E",
+     "53": "B00B9V9ASM",
+     "54": "B00112DRHY",
+     "55": "B019V2KYZS",
+     "56": "B0002JHI1I",
+     "57": "B0006O10P4",
+     "58": "B00120VWTK",
+     "59": "B01DKQAXC0",
+     "60": "B000V5Z4J6",
+     "61": "B000WYJTZG",
+     "62": "B00157OBRU",
+     "63": "B0012XPRO8",
+     "64": "B00GHJOM2U",
+     "65": "B002RZZXYE",
+     "66": "B00MGK9Z8U",
+     "67": "B000W0C07Y",
+     "68": "B00021DJ32",
+     "69": "B006WYJM8Y",
+     "70": "B00EF1QRMU",
+     "71": "B00BSE3III",
+     "72": "B0000530HU",
+     "73": "B000FTYALG",
+     "74": "B000YB70PS",
+     "75": "B00155Z6V2",
+     "76": "B0091OCA86",
+     "77": "B00B7V273E",
+     "78": "B000WR2HB6",
+     "79": "B0014SQQ3M",
+     "80": "B0011FYB5I",
+     "81": "B001QY8QXM",
+     "82": "B007V6JNE0",
+     "83": "B000X2FPXC",
+     "84": "B00126LYJM"
+   },
+   "num_items": 85,
+   "num_users": 991,
+   "min_seq_len": 2,
+   "max_seq_len": 200,
+   "seq_count": 991,
+   "avg_seq_len": 5.316851664984863
+ }
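`datamaps.json` above carries four aligned views of the same 85-item catalog: token maps for the GPT-2 item vocabulary and integer index maps for the embedding arrays. A small sketch with a two-item excerpt showing how the maps invert one another (JSON forces `idx2item` keys to be strings):

```python
# Two-item excerpt from json/datamaps.json
item_to_token = {
    "B0009RF9DW": "<|I_B0009RF9DW|>",
    "B000FI4S1E": "<|I_B000FI4S1E|>",
}
item2idx = {"B0009RF9DW": 0, "B000FI4S1E": 1}

# token_to_item and idx2item are exact inverses of the maps above
token_to_item = {tok: item for item, tok in item_to_token.items()}
idx2item = {str(i): item for item, i in item2idx.items()}

assert token_to_item["<|I_B0009RF9DW|>"] == "B0009RF9DW"
assert idx2item["0"] == "B0009RF9DW"
```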
json/defaults.json ADDED
@@ -0,0 +1,16 @@
+ {
+   "concat": {
+     "w_text": 1.0,
+     "w_image": 0.2,
+     "w_meta": 0.2,
+     "k": 10,
+     "faiss_name": "beauty_concat"
+   },
+   "weighted": {
+     "w_text": 1.0,
+     "w_image": 0.2,
+     "w_meta": 0.2,
+     "k": 10,
+     "faiss_name": "beauty_weighted"
+   }
+ }
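The `w_text`/`w_image`/`w_meta` weights in `defaults.json` suggest a weighted combination of the per-modality arrays in `npy/`. A minimal sketch of one plausible fusion under that reading (toy arrays; the actual fusion logic lives in the API Space and may differ):

```python
import numpy as np

# Fusion weights as in json/defaults.json
w_text, w_image, w_meta = 1.0, 0.2, 0.2

# Toy stand-ins for npy/text.npy, npy/image.npy, npy/meta.npy
rng = np.random.default_rng(0)
text = rng.normal(size=(85, 16))
image = rng.normal(size=(85, 16))
meta = rng.normal(size=(85, 16))

# One plausible "weighted" mode: weighted sum, then L2-normalize per item
fused = w_text * text + w_image * image + w_meta * meta
fused /= np.linalg.norm(fused, axis=1, keepdims=True)

# "concat" mode could instead stack the weighted modalities side by side
concat = np.hstack([w_text * text, w_image * image, w_meta * meta])
```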
json/full_item_ids.json ADDED
@@ -0,0 +1 @@
+ ["0", "1", "2", "3", "4", "5", "6", "7", "8", "9"]
json/logits_reranked.json ADDED
The diff for this file is too large to render. See raw diff
 
json/meta.json ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:4f1d78c87190967a62449ee56c8b56d81b9ecce8544f290826328ae5e04d933e
+ size 66409104
json/reviews.json ADDED
The diff for this file is too large to render. See raw diff
 
json/tokenizer_config.json ADDED
@@ -0,0 +1,7 @@
+ {
+   "tokenizer_class": "GPT2Tokenizer",
+   "eos_token": "",
+   "bos_token": "",
+   "unk_token": "",
+   "pad_token": ""
+ }
json/vocab.json ADDED
The diff for this file is too large to render. See raw diff
 
npy/cove_logits.npy ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:7e70734be52078915ffd6d303d3391e9fccc0ff6194bc333dc4cd14be1dce24d
+ size 199555816
npy/full_cove_embeddings.npy ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:1c4d436b13771cd276008abb39c90a370d7ae875992ab5b84a2fd8a9b0d8fa2c
+ size 261248
npy/logits.npy ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:6242dad1166ac77977b88325c5c3abb637c1d6e987c50cdc64f00ca9b3a25116
+ size 3044480