seanhacks committed (verified)
Commit 8898e2d · Parent(s): 0acc1e0

Publish NapistuDataStore

Files changed (1): README.md (+23 −39)
README.md CHANGED

````diff
@@ -17,10 +17,10 @@ This store was created from GCS asset: **human_consensus_no_rxns** (version: **2
 ### NapistuData (1)
 - `relation_prediction`
 
-### VertexTensors (1)
+### VertexTensor (1)
 - `comprehensive_pathway_memberships`
 
-### pandas DataFrames (5)
+### Pandas DataFrame (5)
 - `edge_strata_by_node_species_type`
 - `edge_strata_by_edge_sbo_terms`
 - `species_identifiers`
@@ -50,48 +50,26 @@ napistu_data = store.load_napistu_data("relation_prediction")
 
 ### Configure DataConfig
 
-You can also use this dataset in your `DataConfig` for PyTorch Lightning experiments:
+You can also use this dataset in your `DataConfig` YAML for PyTorch Lightning experiments:
 
-```python
-from napistu_torch.configs import DataConfig
-from pathlib import Path
-
-# Configure DataConfig to load from HuggingFace Hub
-config = DataConfig(
-    store_dir=Path("./local_store"),
-    hf_repo_id="seanhacks/relation_prediction",
-    hf_revision="main",
-    napistu_data_name="relation_prediction",
-)
-
-# Use with NapistuDataStore.from_config()
-from napistu_torch.napistu_data_store import NapistuDataStore
-store = NapistuDataStore.from_config(config)
+```yaml
+data:
+  store_dir: "./local_store"
+  hf_repo_id: "seanhacks/relation_prediction"
+  hf_revision: "main"
+  napistu_data_name: "relation_prediction"
 ```
 
 To make the store writable (non-read-only), provide paths to the raw data files:
 
-```python
-from napistu_torch.configs import DataConfig
-from pathlib import Path
-
-# Configure DataConfig to load from HuggingFace Hub and enable artifact creation
-config = DataConfig(
-    store_dir=Path("./local_store"),
-    hf_repo_id="seanhacks/relation_prediction",
-    hf_revision="main",
-    sbml_dfs_path=Path("/path/to/sbml_dfs.pkl"),
-    napistu_graph_path=Path("/path/to/napistu_graph.pkl"),
-    napistu_data_name="relation_prediction",
-)
-
-# Use with NapistuDataStore.from_config()
-# This will load from HF and convert to non-read-only automatically
-from napistu_torch.napistu_data_store import NapistuDataStore
-store = NapistuDataStore.from_config(config)
-
-# Now you can create new artifacts
-store.ensure_artifacts(["new_artifact_name"])
+```yaml
+data:
+  store_dir: "./local_store"
+  hf_repo_id: "seanhacks/relation_prediction"
+  hf_revision: "main"
+  sbml_dfs_path: "/path/to/sbml_dfs.pkl"
+  napistu_graph_path: "/path/to/napistu_graph.pkl"
+  napistu_data_name: "relation_prediction"
 ```
 
 ### Load Raw Data from GCS (Optional)
@@ -134,6 +112,12 @@ with tempfile.TemporaryDirectory() as temp_data_dir:
 store.ensure_artifacts(["new_artifact_name"])
 ```
 
+## Links
+
+- 🌐 [Napistu](https://napistu.com)
+- 💻 [GitHub Repository](https://github.com/napistu/Napistu-Torch)
+- 📚 [Napistu Wiki](https://github.com/napistu/napistu/wiki)
+
 ## Citation
 
 If you use this dataset, please cite:
````
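
The YAML `data:` block added in this commit carries the same fields the removed Python example passed to `DataConfig`. A minimal sketch of that correspondence, using a plain dict to stand in for the parsed YAML (in practice a YAML loader or the Lightning CLI handles that step) and a key-suffix heuristic to decide which fields become `Path` objects — the heuristic is an illustration, not documented library behavior:

```python
from pathlib import Path

# The `data:` block from the YAML config, as it would look once parsed.
yaml_data = {
    "store_dir": "./local_store",
    "hf_repo_id": "seanhacks/relation_prediction",
    "hf_revision": "main",
    "napistu_data_name": "relation_prediction",
}

# Map YAML strings onto the keyword arguments the removed Python example
# passed to DataConfig. The "_dir"/"_path" suffix rule is an assumption
# used here only to pick out path-like fields.
config_kwargs = {
    key: Path(value) if key.endswith(("_dir", "_path")) else value
    for key, value in yaml_data.items()
}

# With napistu_torch installed, the removed example built the store as:
# from napistu_torch.configs import DataConfig
# from napistu_torch.napistu_data_store import NapistuDataStore
# config = DataConfig(**config_kwargs)
# store = NapistuDataStore.from_config(config)
```

Adding `sbml_dfs_path` and `napistu_graph_path` entries to `yaml_data` would give the writable-store variant, matching the second YAML block in the diff.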