adamjuhasz committed (verified) · Commit 3b048bf · 1 parent: c86fb83

Update dataset card

Files changed (1):
1. README.md +57 -27
README.md CHANGED
@@ -10,34 +10,12 @@ configs:
- config_name: default
  data_files:
  - split: train
-     path: data/train-*
tags:
- simple-wikipedia
- wikipedia
- markdown
- sqlite
- dataset_info:
-   features:
-   - name: page_id
-     dtype: int64
-   - name: title
-     dtype: string
-   - name: content
-     dtype: string
-   - name: content_no_link
-     dtype: string
-   - name: importance
-     dtype: string
-   - name: truncated
-     dtype: bool
-   - name: error
-     dtype: bool
-   splits:
-   - name: train
-     num_bytes: 797840092
-     num_examples: 273522
-   download_size: 432389310
-   dataset_size: 797840092
---

# Simple English Wikipedia (Markdown)
@@ -52,6 +30,23 @@ Recurring weekly snapshot of Simple English Wikipedia (https://simple.wikipedia.

## Dataset Structure

Columns:
- `page_id` (int64): Unique page identifier from Wikimedia dump.
- `title` (string): Article title.
@@ -61,13 +56,38 @@ Columns:
- `truncated` (bool): `true` when the source article exceeded 40,000 characters; only the first two paragraphs are parsed and stored in this case; otherwise `false`.
- `error` (bool): `true` when the article could not be parsed (content fields are empty in this case); otherwise `false`.

## Processing

- Downloaded `pages-articles` XML dump and verified SHA-1.
- Kept only namespace 0 articles, skipped redirects, and dropped titles beginning with “List of”.
- Stripped templates/ref/gallery blocks and file/category links; converted headings, lists, tables, and internal/external links to Markdown with page IDs.
- - Stored a SQLite mirror (`pages` table) alongside the Hugging Face dataset.
- - Markdown links point to the target page's numeric ID for fast lookup without a title-to-ID join.

## Usage

@@ -80,10 +100,13 @@ ds = load_dataset("juno-labs/simple_wikipedia", split="train")
print(ds[0])
```

- SQLite usage (`simplewiki.sqlite` mirrors the same columns):

```bash
sqlite3 simplewiki.sqlite "SELECT page_id, title, substr(content,1,200) || '...' FROM pages LIMIT 5;"
```

You can also mount it in code:
@@ -99,7 +122,14 @@ for row in cur.execute("SELECT title, content FROM pages WHERE page_id = ?", (75

## Categorization

- - Model: openai/gpt-5-mini

Prompt template:
```
 
- config_name: default
  data_files:
  - split: train
+     path: data/train-*.parquet
tags:
- simple-wikipedia
- wikipedia
- markdown
- sqlite
---

# Simple English Wikipedia (Markdown)
 
## Dataset Structure

+ - Hugging Face split: `train`
+ - SQLite mirror: `simplewiki.sqlite` (contains `pages` and `infobox` tables)
+
+ ## Hugging Face Dataset Columns (train split)
+
+ - `page_id` (int64): Unique page identifier from Wikimedia dump.
+ - `title` (string): Article title.
+ - `content` (string): Article body converted to markdown with internal and external links preserved.
+ - `content_no_link` (string): Same content with markdown links stripped to plain text.
+ - `importance` (string): Importance for a household smart speaker assistant (`low`, `medium`, `high`, or `unknown` when not categorized).
+ - `truncated` (bool): `true` when the source article exceeded 40,000 characters; only the first two paragraphs are parsed and stored in this case; otherwise `false`.
+ - `error` (bool): `true` when the article could not be parsed (content fields are empty in this case); otherwise `false`.
+
+ ## SQLite Database
+
+ ### `pages` table
+
Columns:
- `page_id` (int64): Unique page identifier from Wikimedia dump.
- `title` (string): Article title.
 
- `truncated` (bool): `true` when the source article exceeded 40,000 characters; only the first two paragraphs are parsed and stored in this case; otherwise `false`.
- `error` (bool): `true` when the article could not be parsed (content fields are empty in this case); otherwise `false`.

+ Example:
+
+ ```bash
+ sqlite3 simplewiki.sqlite "SELECT page_id, title, substr(content,1,200) || '...' AS preview FROM pages ORDER BY page_id LIMIT 5;"
+ ```
+
+ ### `infobox` table
+
+ Columns:
+ - `page_id` (int64): Page ID of the article the infobox belongs to.
+ - `article_title` (string): Title of the article.
+ - `infobox_type` (string): Template type (e.g., `country`, `person`, `airport`).
+ - `label` (string): Raw parameter name from the infobox template.
+ - `value` (string): Cleaned text value with wiki links flattened to plain text.
+
+ Examples:
+
+ ```bash
+ # All fields for Australia
+ sqlite3 simplewiki.sqlite "SELECT label, value FROM infobox WHERE page_id = 27 ORDER BY label LIMIT 10;"
+
+ # Join back to article text
+ sqlite3 simplewiki.sqlite "SELECT p.title, i.label, i.value FROM infobox i JOIN pages p ON p.page_id = i.page_id WHERE i.infobox_type = 'country' AND i.label = 'capital' LIMIT 5;"
+ ```
+
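The shell joins above can also be run from Python's built-in `sqlite3` module. Below is a minimal sketch against an in-memory stand-in for `simplewiki.sqlite`; the sample rows are invented for illustration, while the real tables ship with the dataset:

```python
import sqlite3

# Toy in-memory stand-in for simplewiki.sqlite; the real file contains the
# full `pages` and `infobox` tables described above.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE pages (page_id INTEGER PRIMARY KEY, title TEXT, content TEXT);
CREATE TABLE infobox (page_id INTEGER, article_title TEXT,
                      infobox_type TEXT, label TEXT, value TEXT);
INSERT INTO pages VALUES (27, 'Australia', '# Australia ...');
INSERT INTO infobox VALUES (27, 'Australia', 'country', 'capital', 'Canberra');
""")

# Same join as the shell example: infobox fields tied back to article titles.
rows = con.execute(
    "SELECT p.title, i.label, i.value FROM infobox i "
    "JOIN pages p ON p.page_id = i.page_id "
    "WHERE i.infobox_type = 'country' AND i.label = 'capital'"
).fetchall()
print(rows)  # [('Australia', 'capital', 'Canberra')]
```

Against the shipped database, replace `sqlite3.connect(":memory:")` (and drop the toy `executescript`) with `sqlite3.connect("simplewiki.sqlite")`.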
## Processing

- Downloaded `pages-articles` XML dump and verified SHA-1.
- Kept only namespace 0 articles, skipped redirects, and dropped titles beginning with “List of”.
- Stripped templates/ref/gallery blocks and file/category links; converted headings, lists, tables, and internal/external links to Markdown with page IDs.
+ - Stored a SQLite mirror with both `pages` and `infobox` tables; infobox templates are flattened into rows with cleaned text values.
+ - Markdown links point to the target page's numeric ID for fast lookup without a title-to-ID join; infobox values have links stripped to plain text for ease of use.

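The link-conversion step in the Processing list can be sketched roughly as follows. This is an illustrative reimplementation, not the pipeline's actual code; the `title_to_id` mapping, the regex, and the fallback behaviour for unknown targets are assumptions:

```python
import re

def links_to_markdown(wikitext: str, title_to_id: dict[str, int]) -> str:
    """Rewrite [[Target]] / [[Target|label]] wiki links as Markdown links
    whose URL is the target page's numeric ID (sketch of the idea only)."""
    def repl(m: re.Match) -> str:
        target, _, label = m.group(1).partition("|")
        text = label or target
        page_id = title_to_id.get(target)
        # Unknown targets fall back to plain text (an assumption, mirroring
        # the link-stripping behaviour described above).
        return f"[{text}]({page_id})" if page_id is not None else text
    return re.sub(r"\[\[([^\]]+)\]\]", repl, wikitext)

print(links_to_markdown("See [[Australia|the country]].", {"Australia": 27}))
# See [the country](27).
```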
## Usage

 
print(ds[0])
```

+ SQLite usage (`simplewiki.sqlite` mirrors the same columns plus `infobox`):

```bash
sqlite3 simplewiki.sqlite "SELECT page_id, title, substr(content,1,200) || '...' FROM pages LIMIT 5;"
+
+ # Infobox lookup example (Australia):
+ sqlite3 simplewiki.sqlite "SELECT label, value FROM infobox WHERE page_id = 27 ORDER BY label LIMIT 5;"
```

You can also mount it in code:
 

## Categorization

+ Importance labels indicate how useful an article is for day-to-day offline household smart-speaker queries; `unknown` is used when labeling is disabled or fails.
+
+ - Model: openai/gpt-oss-120b
+ - distribution:
+   - low: 67.78%
+   - medium: 29.85%
+   - high: 2.35%
+   - unknown: 0.02%
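The distribution above can be recomputed from the `importance` column of any loaded split. A self-contained sketch over toy rows (the real split has hundreds of thousands of rows; the sample records are invented):

```python
from collections import Counter

# Toy records standing in for dataset rows; real values come from the
# train split's `importance` column.
rows = [
    {"title": "Australia", "importance": "medium"},
    {"title": "Weather", "importance": "high"},
    {"title": "1903 in film", "importance": "low"},
    {"title": "Unparsed page", "importance": "unknown"},
]

counts = Counter(r["importance"] for r in rows)
total = sum(counts.values())
for label in ("low", "medium", "high", "unknown"):
    print(f"{label}: {counts[label] / total:.2%}")
```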

Prompt template:
```