PaczkiLives committed · Commit 127d970 · verified · Parent(s): ae3f841

Upload README.md with huggingface_hub

---
license: mit
task_categories:
- feature-extraction
- text-retrieval
language:
- en
pretty_name: Daemon Wiki FAISS Index
size_categories:
- 10M<n<100M
tags:
- faiss
- wikipedia
- embeddings
- retrieval
---

# Daemon Wiki FAISS Index

Pre-built FAISS IVFPQ index and metadata for the [Daemon](https://github.com/lukehalleran/Daemon) conversational RAG system.

## Contents

| File | Size | Description |
|------|------|-------------|
| `vector_index_ivf.faiss` | ~2.2 GB | FAISS IVFPQ index (48 subquantizers x 8 bits, ~32x compression) |
| `metadata.parquet` | ~12 GB | Row-group metadata (titles, text, timestamps) for zero-copy lookup |

**Coverage:** ~41 million vectors from 6.5M+ English Wikipedia articles, embedded with `sentence-transformers/all-MiniLM-L6-v2` (384-dim).
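
The ~32x compression figure follows directly from the PQ parameters: each 384-dim float32 vector (1,536 bytes raw) is stored as 48 one-byte codes. A quick sanity check of the arithmetic:

```python
# Raw storage per vector: 384 float32 dims at 4 bytes each.
raw_bytes = 384 * 4                      # 1536 bytes per vector
pq_bytes = 48 * (8 // 8)                 # 48 subquantizers x 8 bits = 48 bytes
compression = raw_bytes / pq_bytes       # 32.0x

# Rough code size for ~41M vectors (codes only, ignoring
# IVF lists and codebook overhead).
n_vectors = 41_000_000
approx_gb = n_vectors * pq_bytes / 1e9   # ~1.97 GB, consistent with ~2.2 GB on disk
print(compression, round(approx_gb, 2))
```

The gap between ~1.97 GB of codes and the ~2.2 GB file is the IVF inverted-list structure and quantizer metadata.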

## Usage

### Download

```bash
pip install huggingface_hub

# Download both files into a local directory
huggingface-cli download PaczkiLives/daemon-wiki-faiss \
  --repo-type dataset \
  --local-dir ~/daemon-wiki-data/wiki_data
```

### Point Daemon at the data

Set `WIKI_DATA_ROOT` to the **parent** directory of `wiki_data/`:

```bash
# If you downloaded to ~/daemon-wiki-data/wiki_data/
export WIKI_DATA_ROOT=~/daemon-wiki-data

# Then launch Daemon
python main.py
```

Or set individual paths directly:

```bash
export FAISS_INDEX_PATH=/path/to/wiki_data/vector_index_ivf.faiss
export FAISS_META_PATH=/path/to/wiki_data/metadata.parquet
```
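
The precedence implied above (explicit `FAISS_INDEX_PATH`/`FAISS_META_PATH` win, otherwise paths are derived from `WIKI_DATA_ROOT`) can be sketched as follows. This is a hypothetical helper for illustration, not Daemon's actual config code:

```python
import os

def resolve_wiki_paths(env=os.environ):
    """Resolve index/metadata paths: explicit env vars take precedence
    over WIKI_DATA_ROOT. Hypothetical helper, not Daemon's actual code."""
    root = os.path.expanduser(env.get("WIKI_DATA_ROOT", "."))
    index = env.get("FAISS_INDEX_PATH",
                    os.path.join(root, "wiki_data", "vector_index_ivf.faiss"))
    meta = env.get("FAISS_META_PATH",
                   os.path.join(root, "wiki_data", "metadata.parquet"))
    return index, meta

index_path, meta_path = resolve_wiki_paths(
    {"WIKI_DATA_ROOT": "/home/me/daemon-wiki-data"})
# index_path -> /home/me/daemon-wiki-data/wiki_data/vector_index_ivf.faiss
```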

### Runtime requirements

- **RAM:** ~2.6 GB (2.2 GB FAISS index + 0.4 GB embedding model). Metadata is read on-demand via zero-copy parquet row-group access — no DataFrame loaded into memory.
- **Disk:** ~14.5 GB for both files.
- **CPU:** Works on CPU. No GPU required.

## How it was built

```bash
# From the Daemon repo:
python scripts/build_faiss_index.py
```

The build pipeline:
1. Downloads the latest English Wikipedia dump (~22 GB compressed)
2. Parses XML, extracts article text
3. Chunks articles at ~512 tokens with header-aware splitting
4. Embeds chunks with `all-MiniLM-L6-v2` (384 dimensions)
5. Trains an IVF4096,PQ48 index on a sample, then adds all vectors
6. Writes metadata to a partitioned parquet file for zero-copy reads
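
Step 3's header-aware splitting can be sketched as: split on section headers first, then pack paragraphs into chunks of at most ~512 tokens. This is illustrative only (whitespace tokens stand in for the model tokenizer, and it is not the exact logic of `scripts/build_faiss_index.py`):

```python
import re

def chunk_article(text, max_tokens=512):
    """Split on '== Header ==' wiki section markers, then pack paragraphs
    into chunks of at most max_tokens whitespace tokens.
    Oversized single paragraphs are kept whole in this sketch."""
    sections = re.split(r"(?m)^==.*==\s*$", text)
    chunks = []
    for section in sections:
        current, count = [], 0
        for para in filter(None, (p.strip() for p in section.split("\n\n"))):
            n = len(para.split())
            if current and count + n > max_tokens:
                chunks.append("\n\n".join(current))
                current, count = [], 0
            current.append(para)
            count += n
        if current:
            chunks.append("\n\n".join(current))
    return chunks

doc = "== History ==\n" + "\n\n".join(["one two three four five"] * 300)
print([len(c.split()) for c in chunk_article(doc)])   # [510, 510, 480]
```

Splitting on headers first keeps each chunk inside a single section, so a chunk never straddles a topic boundary.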

## License

MIT — same as the Daemon project.