# HyperForensics++ Dataset Contribution Guide

This guide outlines the steps to contribute to the HyperForensics++ dataset repository.

---

## 1. Clone the Repository
First, clone the repository from Hugging Face to your local machine:
```bash
# Make sure git-lfs is installed (https://git-lfs.com)
git lfs install
git clone https://huggingface.co/datasets/OtoroLin/HyperForensics-plus-plus
cd ./HyperForensics-plus-plus
```
## 2. Extract the Dataset
Decompress the `.tar.gz` files into a separate local hierarchy. You can use the provided `unzipping.sh` script. First, set the `ROOT_DIR` variable to your `local_data` path:
```bash
# In unzipping.sh
# Define the root directory of the whole dataset
ROOT_DIR="/path/to/local_data"
```
Due to the redundancy of our dataset, we compress the files with the **`pigz`** program (parallel gzip) instead of the standard `gzip`. Make sure you install it first:
```bash
sudo apt install pigz
# If you are using miniconda
conda install pigz
```
### 2.1 Decompress a specific method and configuration
```bash
./unzipping.sh --method <method name> --config <config name>
```
Replace `<method name>` with the forgery method and `<config name>` with the configuration index. The script decompresses the corresponding `.tar.gz` file and places its contents in the matching spot in the local dataset hierarchy.
For instance, to decompress the images for the `ADMM-ADAM` forgery method, `config0` configuration:
```bash
./unzipping.sh --method ADMM-ADAM --config 0
```
This will decompress `HyperForensics-plus-plus/data/ADMM-ADAM/config0/config0.tar.gz` to `/path/to/local/ADMM-ADAM/config0/*`.
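If you prefer to extract a single archive by hand, the equivalent `tar` invocation with `pigz` as the decompressor looks roughly like this (paths are illustrative):

```shell
# Extract one archive, using pigz to decompress in parallel
mkdir -p /path/to/local/ADMM-ADAM/config0
tar -I pigz -xf data/ADMM-ADAM/config0/config0.tar.gz \
    -C /path/to/local/ADMM-ADAM/config0
```

`tar -I <prog>` tells tar to filter the archive through the given program; combined with `-x`, the program is invoked for decompression.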
### 2.2 Decompress the entire hierarchy
```bash
./unzipping.sh --all
```
This will recursively decompress every `.tar.gz` file under `HyperForensics-plus-plus/data` and automatically place the extracted files in the corresponding locations in the local dataset hierarchy.

> **Note:** Pull requests on Hugging Face do not use forks and branches. Instead, they use custom "branches" called `refs` (e.g. `refs/pr/42`) that are stored directly on the source repo. The advantage of these refs is that they are not fetched by default by people cloning the repo, but they can still be fetched on demand. Step 5 below shows how to push to one.
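Since PR refs are not fetched by default, you can pull one down on demand and check it out locally; for example, for the hypothetical PR 42 mentioned above:

```shell
# Fetch the PR ref into a local branch and switch to it
git fetch origin refs/pr/42:pr-42
git checkout pr-42
```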
## 3. Modify or Generate Data
Make changes or generate new data in the extracted directory. Ensure the data follows the required structure:
```
local_data/
    |-{attacking method}/
    |   |-config0/
    |   |   |-images.npy
    |   |-config1/
    |   | ...
    |   |-config4/
    |-{attacking method}/
    |   |-config{0-4}/
    |-Origin/
    |   |-images.npy
HyperForensics-plus-plus/
    |-data/
```
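Before compressing, it can be worth sanity-checking the local hierarchy with a short loop (a sketch; it assumes the layout above, with an `images.npy` in every config directory):

```shell
# Report any config directory that is missing its images.npy
ROOT="/path/to/local_data"   # adjust to your local_data path
for cfg in "$ROOT"/*/config*/; do
    [ -f "${cfg}images.npy" ] || echo "missing images.npy in $cfg"
done
```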
## 4. Zip the Directory
After modifying or generating data, compress the directories into `.tar.gz` files and place them in the repository.
You can use the provided `zipping.sh` script. First, set the `ROOT_DIR` variable to your `local_data` path:
```bash
# In zipping.sh
# Define the root directory of the whole dataset
ROOT_DIR="/path/to/local_data"
```
Due to the redundancy of our dataset, we compress the files with the **`pigz`** program (parallel gzip) instead of the standard `gzip`. Make sure you install it first:
```bash
sudo apt install pigz
# If you are using miniconda
conda install pigz
```
### 4.1 Compress a specific method and configuration
```bash
./zipping.sh --method <method name> --config <config name>
```
Replace `<method name>` with the forgery method and `<config name>` with the configuration index. The script automatically places the compressed `.tar.gz` file in the matching spot in the dataset repo's hierarchy.
For instance, to zip the images for the `ADMM-ADAM` forgery method, `config0` configuration:
```bash
./zipping.sh --method ADMM-ADAM --config 0
```
This will compress `/path/to/local/ADMM-ADAM/config0/*` to `HyperForensics-plus-plus/data/ADMM-ADAM/config0/config0.tar.gz`.
### 4.2 Compress a specific directory
```bash
./zipping.sh --dir-path <path/to/directory>
```
Replace `<path/to/directory>` with the intended directory path. This will compress the directory and place the archive in the working directory.
For instance,
```bash
./zipping.sh --dir-path /path/to/local/ADMM-ADAM/config0
```
This will compress `/path/to/local/ADMM-ADAM/config0/*` to `HyperForensics-plus-plus/config0.tar.gz`.
### 4.3 Compress the entire hierarchy
```bash
./zipping.sh --all
```
This will compress each directory in the local data hierarchy separately and automatically place the compressed files in the corresponding locations in the dataset repo.
> **Note** that method/config-specific compression is not implemented for **Origin**, since people are unlikely to need to compress or decompress it regularly. You can either handle it manually or use the path-specific compression (4.2) and move the archive into the hierarchy by hand.
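Conceptually, `--all` amounts to a loop like the following (a sketch of the idea, not the script's actual implementation; paths are placeholders):

```shell
ROOT_DIR="/path/to/local_data"                 # local hierarchy
REPO_DIR="/path/to/HyperForensics-plus-plus"   # cloned dataset repo
for cfg in "$ROOT_DIR"/*/config*/; do
    method=$(basename "$(dirname "$cfg")")
    config=$(basename "$cfg")
    mkdir -p "$REPO_DIR/data/$method/$config"
    # Compress the directory contents with pigz via tar
    tar -I pigz -cf "$REPO_DIR/data/$method/$config/$config.tar.gz" -C "$cfg" .
done
```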
## 5. Create a Pull Request
Push your changes to create a pull request. Hugging Face implements pull requests through `git push` to a special ref:
```bash
git push origin pr/<pr index>:refs/pr/<pr index>
```
Replace `<pr index>` with the pull request index. This assumes your local changes live on a branch named `pr/<pr index>`.
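As a concrete end-to-end example, assuming your changes touch `ADMM-ADAM/config0` and your PR was assigned index 7 (a hypothetical index), the flow would look like:

```shell
# Commit the regenerated archive on a branch named after the PR ref,
# then push it to the PR ref on the Hub (7 is a placeholder index)
git checkout -b pr/7
git add data/ADMM-ADAM/config0/config0.tar.gz
git commit -m "Regenerate ADMM-ADAM config0"
git push origin pr/7:refs/pr/7
```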
## 6. Wait for the Pull Request to be Merged
Once the pull request is created, the repository maintainer will review and merge it.

---
Thank you for contributing to the HyperForensics++ dataset!