---
license: mit
pretty_name: Metamath Proof Graphs (10k)
task_categories:
- graph-ml
tags:
- graphs
- gnn
- metamath
- pytorch-geometric
- topobench
size_categories:
- 10K<n<100K
dataset_summary: >
  A graph-based dataset of 10,000 Metamath theorems and their 10,000
  corresponding proof DAGs, including CodeBERT node embeddings,
  conclusion masking, rare-label collapsing, and fixed train/val/test splits.
---

# Metamath Proof Graphs (10k)

This repository provides a PyTorch Geometric dataset designed for the TAG-DS TopoBench challenge.  
It contains **20,000 graphs total:** 10,000 theorem-only DAGs and 10,000 full proof DAGs drawn from the first 10k theorems in the Metamath [1] database.

## Contents

- **`data.pt`**  
  A preprocessed PyG dataset containing:
  - `data` — global collated storage of all nodes, edges, and labels  
  - `slices` — pointers for reconstructing individual graphs  
  - `train_idx`, `val_idx`, `test_idx` — fixed graph-level splits

---

## Dataset Structure

### **1. Theorem Graphs (indices 0–9,999)**  
Each theorem is represented as a small DAG consisting only of:
- its hypothesis nodes  
- its conclusion node  
- **no proof steps**

These encode the *statement only*, not the derivation.

### **2. Proof Graphs (indices 10,000–19,999)**  
For each of the same theorems, the full proof DAG is included, containing:
- hypothesis nodes  
- intermediate proof steps  
- the same conclusion node  

Thus each theorem appears **twice**:
1. once as a theorem-only graph  
2. once as the complete proof of that theorem  

This pairing enables:
- learning from theorem statements  
- evaluating on masked proof conclusions  
- consistent label space across both halves
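
The pairing convention above can be sketched as a small helper (the 10,000 offset follows directly from the index ranges listed above; the function name is illustrative, not part of the dataset):

```python
NUM_THEOREMS = 10_000  # theorem-only graphs occupy indices 0..9_999

def paired_indices(i: int) -> tuple[int, int]:
    """Return (theorem_graph_idx, proof_graph_idx) for theorem i.

    Theorem-only graphs live at indices 0..9_999 and the matching full
    proof DAGs at 10_000..19_999, so the pair is simply (i, i + 10_000).
    """
    assert 0 <= i < NUM_THEOREMS
    return i, i + NUM_THEOREMS
```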

---

## Additional Details

- Total graphs: **20,000**  
- Node embeddings: **768-dimensional CodeBERT** vectors  
- Graph type: **directed acyclic graphs (DAGs)**  
- Label space: **3,557 justification labels**, where all labels with <5 training occurrences are collapsed into `UNK`  
- **Conclusion masking:** the conclusion node’s embedding is zeroed out; the model must infer its label from the structure and other nodes  
- **Monotonicity constraint:** in Metamath, a proof may only invoke theorems whose database index is less than or equal to that of the theorem being proved, so later theorems never appear in earlier proof graphs  
- Theorem-only graphs are included in training as prior knowledge for downstream proof prediction.
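
As an illustration of the conclusion-masking convention, a zeroed embedding row can be located by its norm. The tensor below is synthetic (4 nodes instead of a real graph, with node 3 playing the conclusion); in the actual dataset the zeroed row is each graph's conclusion node:

```python
import torch

# Hypothetical node-feature matrix: 4 nodes, 768-dim CodeBERT embeddings,
# with the conclusion node's row zeroed out (index 3 here for illustration).
x = torch.randn(4, 768)
x[3] = 0.0

# A masked row has zero L1 norm; this recovers the conclusion node's index.
masked = (x.abs().sum(dim=1) == 0).nonzero(as_tuple=True)[0]
```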

---

## Basic Usage

```python
import torch

obj = torch.load("data.pt", weights_only=False)

data      = obj["data"]
slices    = obj["slices"]
train_idx = obj["train_idx"]
val_idx   = obj["val_idx"]
test_idx  = obj["test_idx"]
```
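
The `data`/`slices` pair follows the standard PyG collated layout, so an individual graph can be cut out by slicing each attribute between consecutive pointers. Below is a minimal sketch assuming the usual attribute names (`x`, `edge_index`, `y`); inspect `slices.keys()` to confirm the actual set:

```python
import torch

def get_graph(data, slices, idx):
    """Reconstruct graph `idx` from collated storage.

    Assumes the conventional PyG layout: node-level and graph-level
    attributes are concatenated along dim 0, while `edge_index` is
    stored as [2, num_edges] and must be sliced along dim 1.
    """
    graph = {}
    for key, ptr in slices.items():
        start, end = int(ptr[idx]), int(ptr[idx + 1])
        dim = 1 if key == "edge_index" else 0
        graph[key] = data[key].narrow(dim, start, end - start)
    return graph
```

If `torch_geometric` is installed, wrapping the result as `Data(**get_graph(data, slices, k))` yields a usable graph object; equivalently, loading the file through a `torch_geometric.data.InMemoryDataset` subclass performs this slicing for you.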

---

## Acknowledgements

Thanks to the Erdős Institute for providing the project-based, collaborative
environment where key components of the preprocessing pipeline were first
developed.

---

## References
[1] Metamath Official Site — <https://us.metamath.org/index.html>