webxos committed · Commit 3a9b8fd · verified · 1 Parent(s): 4f22ac7

Upload 6 files

Files changed (6):
  1. README.md +109 -0
  2. dataset_card.md +99 -0
  3. load_dataset.py +45 -0
  4. metadata.json +21 -0
  5. test.jsonl +0 -0
  6. train.jsonl +0 -0
README.md ADDED
@@ -0,0 +1,109 @@
---
language:
- en
task_categories:
- matrix-computation
- synthetic-data-generation
tags:
- matrix-operations
- synthetic-data
- machine-learning
- mathematics
license: apache-2.0
dataset_info:
  features:
  - name: id
    dtype: string
  - name: timestamp
    dtype: string
  - name: matrix_size
    dtype: int32
  - name: operations
    list:
    - name: type
      dtype: string
    - name: time_ms
      dtype: float32
  - name: matrix_a
    sequence: float32
  - name: matrix_b
    sequence: float32
  - name: matrix
    sequence: float32
  - name: result
    sequence: float32
  - name: error
    dtype: string
  splits:
  - name: train
    num_bytes: 3097616
    num_examples: 400
  - name: test
    num_bytes: 774404
    num_examples: 100
  download_size: 3872020
  dataset_size: 3872020
pretty_name: "matrix_operations"
size_categories:
- n<1K
---

# matrix_operations

Synthetic matrix operations dataset for ML training.

## Dataset Details

- **Generated:** 2026-01-05T22:20:06.264Z
- **Total Samples:** 500
- **Splits:** Train (400), Test (100)
- **Matrix Size:** 8×8
- **Operations:** matmul, add
- **Backend:** WEBGL
- **Format:** jsonl

## Usage

```python
from datasets import load_dataset

# Load the dataset
dataset = load_dataset("matrix_operations")

# Access train and test splits
train_dataset = dataset["train"]
test_dataset = dataset["test"]
```
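If the `datasets` library is not available, the JSONL splits can also be read with the standard library alone. A minimal sketch (the `read_jsonl` helper is mine, not part of this repo; paths assume the repository root):

```python
import json

def read_jsonl(path):
    """Read a JSONL file: one JSON object per line, blank lines skipped."""
    records = []
    with open(path, "r", encoding="utf-8") as f:
        for line in f:
            line = line.strip()
            if line:
                records.append(json.loads(line))
    return records

# train_records = read_jsonl("train.jsonl")  # expected: 400 records
# test_records = read_jsonl("test.jsonl")    # expected: 100 records
```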

## Example

```python
import datasets

# Load dataset
ds = datasets.load_dataset("matrix_operations")

# Get first example
example = ds["train"][0]
print(f"ID: {example['id']}")
print(f"Matrix Size: {example['matrix_size']}")
print(f"Operations: {len(example['operations'])}")
```
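The matrix fields (`matrix_a`, `matrix_b`, `matrix`, `result`) are stored as flat `float32` sequences. Assuming row-major order (an assumption; the card does not state the layout), they can be rebuilt into nested n×n lists:

```python
def reshape_flat(values, n):
    """Rebuild an n x n matrix from a flat row-major list of n*n values."""
    if len(values) != n * n:
        raise ValueError(f"expected {n * n} values, got {len(values)}")
    # Slice the flat list into n consecutive rows of n values each
    return [values[i * n:(i + 1) * n] for i in range(n)]

# Hypothetical usage on a loaded example:
# a = reshape_flat(example["matrix_a"], example["matrix_size"])
```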

## Citation

If you use this dataset in research, please cite:

```bibtex
@dataset{matrix_operations_2026,
  title     = {matrix_operations},
  author    = {Generated by HF Dataset Generator v2.2},
  year      = {2026},
  publisher = {Hugging Face},
  url       = {https://huggingface.co/datasets/your-username/matrix_operations}
}
```

## License

apache-2.0
dataset_card.md ADDED
@@ -0,0 +1,99 @@
# Dataset Card for matrix_operations

## Dataset Description

- **Homepage:** [Add homepage URL if available]
- **Repository:** [Add repository URL]
- **Point of Contact:** [Add contact name/email]

### Dataset Summary

Synthetic matrix operations dataset for ML training.

This dataset was automatically generated by the HF Dataset Generator v2.2 using the TensorFlow.js backend (webgl).

### Supported Tasks

- Matrix operation prediction
- Computational performance benchmarking
- Synthetic data for ML training
- Algorithm validation and testing

### Languages

English

## Dataset Structure

### Data Instances

Each instance contains:
- a unique sample ID
- a generation timestamp
- the matrix size (n×n)
- a list of operations performed, each with:
  - operation type
  - execution time in milliseconds
- input matrices
- result matrices
- error messages (if any)

### Data Fields

- `id`: Unique identifier (string)
- `timestamp`: Generation timestamp (string)
- `matrix_size`: Dimension of the matrices (int32)
- `operations`: List of operations performed (list of dicts)
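The field list above can be turned into a light sanity check on loaded records. A sketch (the validator and its type map are mine; only the four documented top-level fields are covered):

```python
# Documented top-level fields and their expected Python types
EXPECTED_FIELDS = {"id": str, "timestamp": str, "matrix_size": int, "operations": list}

def validate_sample(sample):
    """Return a list of problems found in one record (empty list means OK)."""
    problems = []
    for field, ftype in EXPECTED_FIELDS.items():
        if field not in sample:
            problems.append(f"missing field: {field}")
        elif not isinstance(sample[field], ftype):
            problems.append(f"{field} should be {ftype.__name__}")
    return problems
```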

### Data Splits

- **Train:** 400 samples
- **Test:** 100 samples

## Dataset Creation

### Curation Rationale

This dataset was created to provide synthetic matrix operation data for machine learning research, benchmarking computational kernels, and testing numerical algorithms.

### Source Data

Synthetically generated using TensorFlow.js matrix operations with random normal distributions.

### Annotations

No human annotations.

### Personal and Sensitive Information

None. All data is synthetically generated.

## Considerations for Using the Data

### Social Impact

This dataset enables research in computational mathematics, machine learning optimization, and numerical analysis education.

### Discussion of Biases

Matrices are randomly generated from a normal distribution (mean=0, std=1). Real-world matrices may follow different distributions.

### Other Known Limitations

1. Matrix inversion may fail for singular matrices
2. Performance timings vary by hardware (webgl backend)
3. Limited to square matrices
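Limitation 1 can be screened for before attempting an inverse. One possible guard (a pure-Python sketch of my own; the generator's actual singularity handling is not documented) computes the determinant by Gaussian elimination and treats a near-zero pivot as singular:

```python
def det(matrix):
    """Determinant via Gaussian elimination with partial pivoting."""
    n = len(matrix)
    m = [row[:] for row in matrix]  # work on a copy
    d = 1.0
    for col in range(n):
        # Pick the row with the largest pivot in this column
        pivot = max(range(col, n), key=lambda r: abs(m[r][col]))
        if abs(m[pivot][col]) < 1e-12:
            return 0.0  # numerically singular: skip inversion for this matrix
        if pivot != col:
            m[col], m[pivot] = m[pivot], m[col]
            d = -d  # row swap flips the sign
        d *= m[col][col]
        for r in range(col + 1, n):
            f = m[r][col] / m[col][col]
            for c in range(col, n):
                m[r][c] -= f * m[col][c]
    return d
```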

## Additional Information

### Dataset Curators

Generated automatically by HF Dataset Generator v2.2.

### Licensing Information

Apache-2.0 License

### Contributions

Thanks to the TensorFlow.js and Hugging Face communities.
load_dataset.py ADDED
@@ -0,0 +1,45 @@
#!/usr/bin/env python3
"""
Script to load and verify the matrix_operations dataset
"""

import json
from pathlib import Path

DATASET_NAME = "matrix_operations"


def load_and_verify_dataset():
    dataset_path = Path(".")

    print(f"Loading {DATASET_NAME} dataset...")

    # Load train split
    train_data = []
    if (dataset_path / "train.jsonl").exists():
        with open(dataset_path / "train.jsonl", "r") as f:
            for line in f:
                train_data.append(json.loads(line))
        print(f"Loaded {len(train_data)} train samples")

    # Load test split
    test_data = []
    if (dataset_path / "test.jsonl").exists():
        with open(dataset_path / "test.jsonl", "r") as f:
            for line in f:
                test_data.append(json.loads(line))
        print(f"Loaded {len(test_data)} test samples")

    # Basic validation
    print("\nDataset Validation:")
    print(f"Total samples: {len(train_data) + len(test_data)}")

    if train_data:
        sample = train_data[0]
        print(f"Sample keys: {list(sample.keys())}")
        print(f"Matrix size: {sample.get('matrix_size')}")
        print(f"Operations count: {len(sample.get('operations', []))}")

    print("\nDataset ready for use!")
    print("To upload to Hugging Face Hub:")
    print(f"  git push https://huggingface.co/datasets/your-username/{DATASET_NAME}")


if __name__ == "__main__":
    load_and_verify_dataset()
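Beyond verification, the loaded records support quick benchmarking summaries. A sketch (the field names `type` and `time_ms` come from the dataset schema; the aggregation helper itself is mine):

```python
from collections import defaultdict

def timing_stats(samples):
    """Mean execution time in ms per operation type, across all samples."""
    totals = defaultdict(lambda: [0.0, 0])  # op type -> [sum of times, count]
    for sample in samples:
        for op in sample.get("operations", []):
            entry = totals[op["type"]]
            entry[0] += op["time_ms"]
            entry[1] += 1
    return {op_type: total / count for op_type, (total, count) in totals.items()}
```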
metadata.json ADDED
@@ -0,0 +1,21 @@
{
  "name": "matrix_operations",
  "description": "Synthetic matrix operations dataset for ML training",
  "license": "apache-2.0",
  "format": "jsonl",
  "generated_at": "2026-01-05T22:19:45.610Z",
  "operations": [
    "matmul",
    "add"
  ],
  "matrix_size": 8,
  "backend": "webgl",
  "splits": {
    "train": 400,
    "test": 100
  },
  "samples": 500,
  "train_samples": 400,
  "test_samples": 100,
  "size_kb": "3781.27"
}
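The metadata carries the sample counts in three redundant places (`splits`, `samples`, and `train_samples`/`test_samples`), so a consistency check is cheap. A sketch (the `check_metadata` helper is mine, not part of this repo):

```python
def check_metadata(meta):
    """Verify that the redundant sample counts in metadata.json agree."""
    splits_total = sum(meta["splits"].values())
    return (splits_total == meta["samples"]
            and meta["splits"]["train"] == meta["train_samples"]
            and meta["splits"]["test"] == meta["test_samples"])

# Hypothetical usage:
# import json
# with open("metadata.json") as f:
#     assert check_metadata(json.load(f))
```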
test.jsonl ADDED
The diff for this file is too large to render.
 
train.jsonl ADDED
The diff for this file is too large to render.