Datasets · Modalities: Image · Size: < 1K · Libraries: Datasets
Rene committed
Commit 7ca0e73 · 1 Parent(s): 4a2895a

Improved readme and added demo notebook

Files changed (4)
  1. README.md +70 -2
  2. Visualizer.ipynb +0 -0
  3. data.png +3 -0
  4. training.png +3 -0
README.md CHANGED
@@ -8,7 +8,8 @@ This dataset contains the data for the first test case (1D compressible SPH) for

You can find the full paper [here](https://arxiv.org/abs/2403.16680).

- The source core repository is available [here](https://github.com/tum-pbs/SFBC/) and also contains information on the data generation
+ The source code repository is available [here](https://github.com/tum-pbs/SFBC/) and also contains information on the data generation. You can install our BasisConvolution framework simply by running
+ `pip install BasisConvolution`

For the other test case datasets look here:

@@ -22,4 +23,71 @@ For the other test case datasets look here:

## File Layout

- The datasets are stored as hdf5 files with a single file per experiment. Within each file there is a set of configuration parameters and each frame of the simulation stored separately as a group. Each frame contains information for all fluid particles and all potentially relevant information. For the 2D test cases there is a pre-defined test/train split on a simulation level, wheras the 1D and 3D cases do not contain such a split.
+ The datasets are stored as HDF5 files with a single file per experiment. Within each file there is a set of configuration parameters, and each frame of the simulation is stored separately as a group. Each frame contains information for all fluid particles and all potentially relevant information. For the 2D test cases there is a pre-defined test/train split on the simulation level, whereas the 1D and 3D cases do not contain such a split.
+
+ ## Demonstration
+
+ This repository contains a simple Jupyter notebook (Visualizer.ipynb) that first loads the dataset from its current folder and visualizes it:
+
+ ![Dataset visualization](data.png)
+
+ It then runs a simple training on the data to learn the SPH summation-based density for different basis functions:
+
+ ![Training results for different basis functions](training.png)
+
+ ## Minimum Working Example
+
+ Below you can find a fully working but simple example of loading our dataset, building a network (based on our SFBC framework), and performing a single training step. This relies on our SFBC/BasisConvolution framework, which you can find [here](https://github.com/tum-pbs/SFBC/) or simply install via pip (`pip install BasisConvolution`).
+
+ ```py
+ from BasisConvolution.util.hyperparameters import parseHyperParameters, finalizeHyperParameters
+ from BasisConvolution.util.network import buildModel, runInference
+ from BasisConvolution.util.augment import loadAugmentedBatch
+ from BasisConvolution.util.arguments import parser
+ import shlex
+ import torch
+ from torch.utils.data import DataLoader
+ from BasisConvolution.util.dataloader import datasetLoader, processFolder
+
+ # Example arguments
+ args = parser.parse_args(shlex.split('--fluidFeatures constant:1 --boundaryFeatures constant:1 --groundTruth compute[rho]:constant:1/constant:rho0 --basisFunctions ffourier --basisTerms 4 --windowFunction "None" --maxUnroll 0 --frameDistance 0 --epochs 1'))
+ # Parse the arguments
+ hyperParameterDict = parseHyperParameters(args, None)
+ hyperParameterDict['device'] = 'cuda' # Make sure to use a GPU if you can
+ hyperParameterDict['iterations'] = 2**10 # Works well enough for this toy problem
+ hyperParameterDict['batchSize'] = 4 # Automatic batched loading is supported
+ hyperParameterDict['boundary'] = False # Make sure the data loader does not expect boundary data (this yields a warning if not set)
+
+ # Build the dataset
+ datasetPath = 'dataset'
+ train_ds = datasetLoader(processFolder(hyperParameterDict, datasetPath))
+ # And its respective loader/iterator combo as a batch sampler (this is our preferred method)
+ train_loader = DataLoader(train_ds, shuffle=True, batch_size = hyperParameterDict['batchSize']).batch_sampler
+ train_iter = iter(train_loader)
+ # Align the hyperparameters with the dataset, e.g., dimensionality
+ finalizeHyperParameters(hyperParameterDict, train_ds)
+ # Build a model for the given hyperparameters
+ model, optimizer, scheduler = buildModel(hyperParameterDict, verbose = False)
+
+ # Get a batch of data
+ try:
+     bdata = next(train_iter)
+ except StopIteration:
+     train_iter = iter(train_loader)
+     bdata = next(train_iter)
+ # Load the data; the data loader does augmentation and neighbor searching automatically
+ configs, attributes, currentStates, priorStates, trajectoryStates = loadAugmentedBatch(bdata, train_ds, hyperParameterDict)
+ # Run the forward pass
+ optimizer.zero_grad()
+ predictions = runInference(currentStates, configs, model, verbose = False)
+ # Compute the loss
+ gts = [traj[0]['fluid']['target'] for traj in trajectoryStates]
+ losses = [torch.nn.functional.mse_loss(prediction, gt) for prediction, gt in zip(predictions, gts)]
+ # Run the backward pass
+ loss = torch.stack(losses).mean()
+ loss.backward()
+ optimizer.step()
+ # Print the loss
+ print(loss.item())
+ print('Done')
+ ```
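
For reference, the File Layout described above can also be inspected directly with h5py. The snippet below is a minimal sketch, assuming h5py is installed (`pip install h5py`); the file name and the exact group, attribute, and dataset names are placeholders, so check the keys of your own file rather than relying on the names used here.

```py
import h5py

# Open one experiment file (the file name here is hypothetical).
with h5py.File('dataset/example_experiment.hdf5', 'r') as f:
    # Configuration parameters may live in the file attributes or a dedicated group;
    # print both views and adjust to what your file actually contains.
    print('File attributes:', dict(f.attrs))
    print('Top-level groups:', list(f.keys()))
    # Each simulation frame is stored as its own group; pick the first one
    # and list the per-particle quantities it holds for the fluid particles.
    frames = [k for k in f.keys()]
    first = f[frames[0]]
    for name, item in first.items():
        print(name, getattr(item, 'shape', None))
```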
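The Demonstration notebook learns the SPH summation-based density, i.e. rho_i = sum_j m_j W(x_i - x_j, h). As background, below is a minimal NumPy sketch of that ground truth in 1D using the standard cubic spline kernel; the kernel, support radius, and mass convention used to generate this dataset are assumptions here, so treat the snippet as illustrative only.

```py
import numpy as np

def cubicSpline1D(r, h):
    # Standard 1D cubic spline kernel (support 2h, normalization 2 / (3h)).
    q = np.abs(r) / h
    sigma = 2.0 / (3.0 * h)
    w = np.where(q < 1.0, 1.0 - 1.5 * q**2 + 0.75 * q**3,
        np.where(q < 2.0, 0.25 * (2.0 - q)**3, 0.0))
    return sigma * w

def summationDensity(x, m, h):
    # rho_i = sum_j m_j W(x_i - x_j, h); a dense pairwise sum is fine for small 1D cases.
    r = x[:, None] - x[None, :]
    return (m[None, :] * cubicSpline1D(r, h)).sum(axis=1)

# Toy usage: equally spaced particles of equal mass on the unit interval.
x = np.linspace(0.0, 1.0, 128, endpoint=False)
m = np.full_like(x, 1.0 / len(x))
rho = summationDensity(x, m, h=2.0 * (x[1] - x[0]))
print(rho.mean())  # close to 1 away from the open boundaries
```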
Visualizer.ipynb ADDED
The diff for this file is too large to render.
 
data.png ADDED

Git LFS Details

  • SHA256: 77977621aa2f3c084345714d03a6979e4f9745fac120e650541de4b7869d032e
  • Pointer size: 131 Bytes
  • Size of remote file: 124 kB
training.png ADDED

Git LFS Details

  • SHA256: ed8c5ca8bb182d503b1834cfb2259d7fa10603bdd87e3d1032a55a123e413107
  • Pointer size: 131 Bytes
  • Size of remote file: 116 kB