---
license: bsd
---
# NeuMERL

NeuMERL is the Neural Augmented [MERL dataset](https://www.merl.com/research/downloads/BRDF) (2,400 BRDFs) adopted in the paper [NeuMaDiff: Neural Material Synthesis via Hyperdiffusion](https://arxiv.org/abs/2411.12015).

Please download it at [./NeuMERL-2400.npy](./NeuMERL-2400.npy); it stores the PyTorch MLP weights as an array of shape (2400, 675).

Alternatively, please download the weights in 24 shards of 100 materials each (for `i` from 1 to 24) at [./NeuMERL(24*100)/mlp_weights_all_{i}.npy](https://huggingface.co/datasets/Peter2023HuggingFace/NeuMERL/tree/main/NeuMERL(24*100)).
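
If you download the shards separately, they can be reassembled into the full array. A minimal sketch, assuming each shard holds 100 rows in material order (the local file paths are illustrative):

```python
import numpy as np

def assemble_shards(shards):
    """Stack per-shard weight arrays back into one (N, 675) array."""
    return np.concatenate(shards, axis=0)

# Hypothetical usage once the 24 files are downloaded locally:
# shards = [np.load(f"NeuMERL(24*100)/mlp_weights_all_{i}.npy") for i in range(1, 25)]
# weights = assemble_shards(shards)  # shape (2400, 675)
```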

![img-visual](./img/visual.png)

## Usage

The dataset is stored in a NumPy file, with each material's MLP weights linearized into a single row. For the NBRDF MLP architecture, please refer to [./nbrdf-release.py](./nbrdf-release.py).

- Input: Cartesian coordinates of the positional samples (hx, hy, hz, dx, dy, dz).
- Output: MERL reflectance value.
- To convert a row back to model weights (`.pth`), update the example checkpoint [example-brdf.pth](./example-brdf.pth) (alum-bronze) with the unflattened tensors.
  - To then convert it back to a MERL binary, load the model from the `.pth` file and record its inference results over all positional inputs.
  - For details, please refer to the [GitHub](https://github.com/PeterHUistyping/M3ashy) repo.

```
fc1.weight       (21, 6)
fc1.bias         (21,)
fc2.weight       (21, 21)
fc2.bias         (21,)
fc3.weight       (3, 21)
fc3.bias         (3,)
```
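
Given the layout above (126 + 21 + 441 + 21 + 63 + 3 = 675 parameters), one row of the array can be unpacked back into named tensors. A sketch, assuming the parameters are stored flat in the listed order (check [./nbrdf-release.py](./nbrdf-release.py) for the exact layout):

```python
import numpy as np

# Per-layer shapes from the listing above; the flattening order
# (fc1.weight, fc1.bias, fc2.weight, fc2.bias, fc3.weight, fc3.bias)
# is an assumption.
SHAPES = [
    ("fc1.weight", (21, 6)),
    ("fc1.bias", (21,)),
    ("fc2.weight", (21, 21)),
    ("fc2.bias", (21,)),
    ("fc3.weight", (3, 21)),
    ("fc3.bias", (3,)),
]

def unflatten(row):
    """Unpack one 675-element row into a dict of named weight arrays."""
    state, offset = {}, 0
    for name, shape in SHAPES:
        size = int(np.prod(shape))
        state[name] = np.asarray(row[offset:offset + size]).reshape(shape)
        offset += size
    assert offset == len(row) == 675  # every parameter consumed
    return state

# weights = np.load("NeuMERL-2400.npy")  # shape (2400, 675)
# state = unflatten(weights[0])          # first material
```

Each array can then be converted with `torch.from_numpy` and loaded into the NBRDF model via `load_state_dict` to produce a `.pth` checkpoint.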

## Dataset formation

To form the Aug-MERL dataset, we first augment the original [MERL dataset](https://www.merl.com/research/downloads/BRDF) with color channel permutations; the first group of materials (1-600) involves no interpolation. We then further augment the BRDFs via direct linear interpolation, forming three more groups of materials (601-1200, 1201-1800, 1801-2400), where each group follows the same color channel permutation scheme as the first ([Sec 3.1](https://arxiv.org/abs/2411.12015)).

Next, we adopt neural fields as a low-dimensional, continuous representation for materials, fitting them to individual materials in Aug-MERL to create a new dataset of neural material representations, Neural MERL (NeuMERL). For weight initialization, we use the same seed for all materials; see [mlp_weights_ini.pth](./mlp_weights_ini.pth).

### Color channel permutation

Color channels are permuted by material id: RGB (1-100), RBG (101-200), GRB (201-300), GBR (301-400), BRG (401-500), BGR (501-600), and similarly for later groups.

```python
file_index = (mat_id - 1) // 100 + 1  # mat_id is 1-indexed; group 1 = materials 1-100
rgb_type = file_index % 6
if rgb_type == 1:   brdf = brdf[..., [0, 1, 2]]  # merl_1
elif rgb_type == 2: brdf = brdf[..., [0, 2, 1]]  # merl_2
elif rgb_type == 3: brdf = brdf[..., [1, 0, 2]]  # merl_3
elif rgb_type == 4: brdf = brdf[..., [1, 2, 0]]  # merl_4
elif rgb_type == 5: brdf = brdf[..., [2, 0, 1]]  # merl_5
elif rgb_type == 0: brdf = brdf[..., [2, 1, 0]]  # merl_6
```
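
Wrapped as a helper, the mapping above can be applied to any BRDF array with a trailing channel axis (material ids are assumed 1-indexed per the grouping above; the sample values are illustrative):

```python
import numpy as np

# rgb_type -> channel index order, matching the snippet above.
PERMS = {1: [0, 1, 2], 2: [0, 2, 1], 3: [1, 0, 2],
         4: [1, 2, 0], 5: [2, 0, 1], 0: [2, 1, 0]}

def permute_channels(brdf, mat_id):
    """Apply the group's channel permutation to a (..., 3) BRDF array."""
    file_index = (mat_id - 1) // 100 + 1   # group 1 = materials 1-100
    rgb_type = file_index % 6
    return brdf[..., PERMS[rgb_type]]

sample = np.array([[1.0, 2.0, 3.0]])       # one sample: (R, G, B)
permute_channels(sample, 150)              # material 150 -> RBG
```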

### BRDF interpolation

For materials 601 to 2400, we interpolate the BRDFs of two materials from the original MERL dataset to form a new BRDF.

- For 601-1200, i=0;
- For 1201-1800, i=1;
- For 1801-2400, i=2.

Please refer to [./interpolate/interpolate_interpolation_{i}.txt](https://huggingface.co/datasets/Peter2023HuggingFace/NeuMERL/tree/main/interpolate); each line is in the format

```
MERL_BRDF1_name MERL_BRDF2_name alpha
```

where $\alpha \in [0.3, 0.7]$ is the linear interpolation coefficient. The interpolated BRDF is

$$
f^{*}_r = \alpha \times f^{1}_r + (1-\alpha) \times f^{2}_r
$$
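
The interpolation step can be sketched as follows; the line shown is illustrative (material names and $\alpha$ come from the interpolation files), and `load_merl` is a hypothetical loader:

```python
import numpy as np

def interpolate_brdf(brdf1, brdf2, alpha):
    """f*_r = alpha * f1_r + (1 - alpha) * f2_r, elementwise."""
    return alpha * brdf1 + (1.0 - alpha) * brdf2

# One line of interpolate_interpolation_{i}.txt: "name1 name2 alpha"
line = "alum-bronze blue-acrylic 0.45"     # illustrative entry
name1, name2, alpha = line.split()
alpha = float(alpha)
# brdf1, brdf2 = load_merl(name1), load_merl(name2)  # hypothetical loader
# brdf_new = interpolate_brdf(brdf1, brdf2, alpha)
```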

## Citation

If you find the paper or dataset useful, please consider citing:

```bibtex
@inproceedings{M3ashy2026,
    author = {Chenliang Zhou and Zheyuan Hu and Alejandro Sztrajman and Yancheng Cai and Yaru Liu and Cengiz Oztireli},
    title = {M$^{3}$ashy: Multi-Modal Material Synthesis via Hyperdiffusion},
    year = {2026},
    booktitle = {Proceedings of the 40th AAAI Conference on Artificial Intelligence},
    location = {Singapore},
    series = {AAAI'26}
}
```