|
|
# HyperForensics++ Dataset Contribution Guide |
|
|
|
|
|
This guide outlines the steps to contribute to the HyperForensics++ dataset repository. |
|
|
|
|
|
--- |
|
|
|
|
|
## 1. Clone the Repository |
|
|
First, clone the repository from Hugging Face to your local machine: |
|
|
```bash |
|
|
# Make sure git-lfs is installed (https://git-lfs.com) |
|
|
git lfs install |
|
|
git clone https://huggingface.co/datasets/OtoroLin/HyperForensics-plus-plus |
|
|
cd ./HyperForensics-plus-plus |
|
|
``` |
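The repository's `.tar.gz` archives are large LFS objects. If you do not want to download every archive up front, one option (assuming the archives are the files tracked by LFS) is to skip the LFS download during cloning and pull individual files on demand:

```bash
# Clone with LFS pointer files only (no large downloads yet)
GIT_LFS_SKIP_SMUDGE=1 git clone https://huggingface.co/datasets/OtoroLin/HyperForensics-plus-plus
cd ./HyperForensics-plus-plus

# Later, fetch just the archives you need
git lfs pull --include="data/ADMM-ADAM/config0/*"
```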
|
|
## 2. Extract the Dataset |
|
|
Decompress the `.tar.gz` files into a separate local hierarchy. You can use the provided `unzipping.sh` script. First, set the `ROOT_DIR` variable to your `local_data` path:
|
|
```bash |
|
|
# In unzipping.sh
|
|
# Define the root directory of the whole dataset
|
|
ROOT_DIR="/root/to/local_data" |
|
|
``` |
|
|
Due to the redundancy of our dataset, we use the **`pigz`** program instead of the standard `gzip` program to compress the files. Make sure it is installed first:
|
|
```bash |
|
|
sudo apt install pigz |
|
|
# If you are using miniconda |
|
|
conda install pigz |
|
|
``` |
|
|
### 2.1 Decompress a specific method and configuration
|
|
```bash |
|
|
./unzipping.sh --method <method name> --config <config name> |
|
|
``` |
|
|
Replace `<method name>` with the forgery method and `<config name>` with the configuration index. The script decompresses the corresponding `.tar.gz` file and places its contents under the matching path in the local dataset directory.
|
|
For instance, to decompress the images for the `ADMM-ADAM` forgery method with the `config0` configuration:
|
|
```bash |
|
|
./unzipping.sh --method ADMM-ADAM --config 0 |
|
|
``` |
|
|
This will decompress `HyperForensics-plus-plus/data/ADMM-ADAM/config0/config0.tar.gz` to `/path/to/local/ADMM-ADAM/config0/*`. |
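Under the hood this corresponds roughly to the following manual invocation (a sketch; the exact flags used by `unzipping.sh` may differ):

```bash
# Decompress through pigz instead of gzip; -C sets the output directory
mkdir -p /path/to/local/ADMM-ADAM/config0
tar --use-compress-program=pigz -xf \
    data/ADMM-ADAM/config0/config0.tar.gz \
    -C /path/to/local/ADMM-ADAM/config0
```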
|
|
### 2.2 Decompress the entire hierarchy |
|
|
```bash |
|
|
./unzipping.sh --all |
|
|
``` |
|
|
This recursively decompresses every `.tar.gz` file under `HyperForensics-plus-plus/data` and places the extracted files under the matching paths in the local dataset directory.
|
|
|
|
|
## 3. [Create a pull request](https://huggingface.co/docs/hub/repositories-pull-requests-discussions#pull-requests-advanced-usage) |
|
|
Pull requests on Hugging Face do not use forks and branches; instead they use custom “branches” called `refs` that are stored directly on the source repo.
|
|
|
|
|
The advantage of using custom `refs` (like `refs/pr/42` for instance) instead of branches is that they’re not fetched (by default) by people cloning the repo, but they can still be fetched on demand. |
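For example, you can list the PR refs that exist on the remote without fetching them:

```bash
# List all pull-request refs stored on the source repo
git ls-remote origin "refs/pr/*"
```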
|
|
### 3.1 Create a new PR on Hugging Face
|
|
Go to *Community* $\to$ *New pull request* $\to$ *On your machine* $\to$ enter a *Pull request title* $\to$ *Create PR branch*
|
|
|
|
|
Then you should see a brand-new PR page with the title of your PR at the top left. Make a note of the index number next to the title, `#{index}`.
|
|
> This step creates a new custom branch on the **remote** under the reference `refs/pr/{index}`.
|
|
|
|
|
The PR starts in **draft mode**; the maintainer cannot merge it until you publish it.
|
|
### 3.2 Create the PR branch locally
|
|
```bash |
|
|
git fetch origin refs/pr/<index>:pr/<index>
|
|
git checkout pr/<index> |
|
|
``` |
|
|
This fetches the remote ref `refs/pr/<index>` into a local branch named `pr/<index>`, then checks it out. For unambiguity, we use the PR index as the local branch name, but you can create the local branch under any name you like.
|
|
### 3.3 Push the PR branch |
|
|
After you finish your modifications, push the local PR branch back to the Hugging Face remote; see [Section 6](#6-create-a-pull-request).
|
|
|
|
|
## 4. Modify or Generate Data |
|
|
Make changes or generate new data in the extracted directory. Ensure the data follows the required structure: |
|
|
``` |
|
|
local_data/
|-{attacking method}/
| |-config0/
| | |-images.npy
| |-config1/
| | ...
| |-config4/
|-{attacking method}/
| |-config{0-4}/
|-Origin/
| |-images.npy

HyperForensics-plus-plus/
|-data/
|
|
|-data/ |
|
|
``` |
|
|
## 5. Zip the Directory |
|
|
After modifying or generating data, compress the directories into `.tar.gz` files and place them in the repository.
|
|
You can use the provided `zipping.sh` script. First, set the `ROOT_DIR` variable to your `local_data` path:
|
|
```bash |
|
|
# In zipping.sh |
|
|
# Define the root directory of the whole dataset
|
|
ROOT_DIR="/root/to/local_data" |
|
|
``` |
|
|
Due to the redundancy of our dataset, we use the **`pigz`** program instead of the standard `gzip` program to compress the files. Make sure it is installed first:
|
|
```bash |
|
|
sudo apt install pigz |
|
|
# If you are using miniconda |
|
|
conda install pigz |
|
|
``` |
|
|
### 5.1 Compress a specific method and configuration
|
|
```bash |
|
|
./zipping.sh --method <method name> --config <config name> |
|
|
``` |
|
|
Replace `<method name>` with the forgery method and `<config name>` with the configuration index. The script places the compressed `.tar.gz` file under the matching path in the dataset repo.
|
|
For instance, to compress the images for the `ADMM-ADAM` forgery method with the `config0` configuration:
|
|
```bash |
|
|
./zipping.sh --method ADMM-ADAM --config 0 |
|
|
``` |
|
|
This will compress `/path/to/local/ADMM-ADAM/config0/*` to `HyperForensics-plus-plus/data/ADMM-ADAM/config0/config0.tar.gz`. |
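Manually, this corresponds roughly to (a sketch; the exact flags used by `zipping.sh` may differ):

```bash
# Compress through pigz; -C makes archive paths relative to config0
tar --use-compress-program=pigz -cf \
    data/ADMM-ADAM/config0/config0.tar.gz \
    -C /path/to/local/ADMM-ADAM/config0 .
```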
|
|
### 5.2 Compress a specific directory
|
|
```bash |
|
|
./zipping.sh --dir-path <path/to/directory> |
|
|
``` |
|
|
Replace `<path/to/directory>` with the intended directory path. This compresses the directory and places the archive in the working directory.
|
|
For instance, |
|
|
```bash |
|
|
./zipping.sh --dir-path /path/to/local/ADMM-ADAM/config0 |
|
|
``` |
|
|
This will compress `/path/to/local/ADMM-ADAM/config0/*` to `HyperForensics-plus-plus/config0.tar.gz`. |
|
|
### 5.3 Compress the entire hierarchy |
|
|
```bash |
|
|
./zipping.sh --all |
|
|
``` |
|
|
This compresses every directory in the local data hierarchy separately and places the compressed files under the matching paths in the dataset repo.
|
|
> **Note**: method/config-specific compression is not implemented for **Origin**, since it seems unlikely that anyone needs to compress or decompress it regularly. You can either handle it manually or use the path-specific mode and place the archive into the hierarchy by hand, as sketched below.
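A possible manual workflow for **Origin**, run from the repo root and assuming the path-specific mode names the archive after the directory (as in the `config0` example above; paths are illustrative):

```bash
# Compress Origin, then move the archive into the repo hierarchy by hand
./zipping.sh --dir-path /path/to/local_data/Origin
mkdir -p data/Origin
mv Origin.tar.gz data/Origin/
```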
|
|
|
|
|
## 6. Create a Pull Request |
|
|
Push your changes to the PR branch you created:
|
|
```bash |
|
|
git push origin pr/<index>:refs/pr/<index> |
|
|
``` |
|
|
Replace `<index>` with the pull request index, then click the ***Publish*** button at the bottom of the PR page.
|
|
> You can also leave a comment on your PR describing what you modified.
|
|
## 7. Wait for the Pull Request to Be Merged
|
|
Once the pull request is published, the repository maintainer will review and merge it. |
|
|
|
|
|
--- |
|
|
Thank you for contributing to the HyperForensics++ dataset! |
|
|
|
|
|
## Three Main Scopes of the Study
|
|
1. The data generation |
|
|
2. The quality metrics to determine the quality of our dataset
|
|
3. The forgery detection baseline models |
|
|
### The data generation |
|
|
#### Removal Method |
|
|
This scope is almost done by min-zuo, with two algorithms: 1. ADMM-ADAM; 2. FastHyIn.
|
|
> However, since both strategies inpaint the area using the spectral domain, I have to check carefully that they actually **remove** the data instead of recovering it.
|
|
> How to do that? I need a **classifier or segmentation model** to run on the forged image and confirm that its output is wrong.
|
|
#### Object Replacement
|
|
This scope is in min-zuo's hands. He uses the CRS_diff diffusion model plus harmonization to generate the replacement examples.
|
|
:::warning |
|
|
Image examples over water are hard cases and the current bottleneck.

Needs more attention.
|
|
::: |
|
|
#### Adversarial Attack
|
|
This is implemented by 恩召; I did not actually ask much about it. The plan is to choose one adversarial-attack strategy, focusing on highly [transferable](https://zhuanlan.zhihu.com/p/686734868) attacks.
|
|
|
|
|
### Quality Metrics
|
|
#### Spectral Consistency Metrics |
|
|
These metrics are used in HSI tampering to ensure that the forged region's spectrum matches the surrounding context (standard definitions of the first two are sketched after the list):
|
|
1. SAM |
|
|
2. Spectral RMSE |
|
|
3. Pearson correlation (need more info) |
|
|
4. ERGAS (need more info)
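For reference, a sketch of the standard per-pixel definitions of the first two, with $\mathbf{x}$ and $\mathbf{y}$ the reference and forged spectra over $B$ bands:

$$
\mathrm{SAM}(\mathbf{x}, \mathbf{y}) = \arccos\!\left(\frac{\mathbf{x}^\top \mathbf{y}}{\lVert\mathbf{x}\rVert_2 \, \lVert\mathbf{y}\rVert_2}\right),
\qquad
\mathrm{RMSE}(\mathbf{x}, \mathbf{y}) = \sqrt{\frac{1}{B} \sum_{b=1}^{B} \left(x_b - y_b\right)^2}
$$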
|
|
#### Forgery Detectability Metrics |
|
|
I think this is the most important category, since we are building a forgery dataset. We want to show:
|
|
- rate of forgery images being mis-classified is high |
|
|
1. FNR |
|
|
2. Attack Success Rate |
|
|
3. Detection Score Drop (detector confidence before/after forgery) |
|
|
We can use either a classification model or a segmentation model (the first two metrics above are sketched after this list).
|
|
- the rate of forgeries not detected by a detection model is low
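As a sketch, treating "forged" as the positive class, FNR is the standard confusion-matrix quantity; one common way to define the attack success rate here (an assumption on my part, to be confirmed) is the fraction of forged samples the detector accepts as authentic:

$$
\mathrm{FNR} = \frac{FN}{FN + TP},
\qquad
\mathrm{ASR} = \frac{\#\{\text{forged samples classified as authentic}\}}{\#\{\text{forged samples}\}}
$$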
|
|
#### Visual Inspection |
|
|
1. User studies - using false color? |
|
|
2. Visual result figures (origin vs. forged vs. mask) |
|
|
#### Visual Quality Metrics
|
|
This category is relatively unimportant: since this is a forgery dataset, how similar the forged image is to the ground truth cannot evaluate the goodness of the forgery.
|
|
1. PSNR |
|
|
2. SSIM |
|
|
3. LPIPS (Learned Perceptual Image Patch Similarity)
|
|
### Forgery Detection Baseline Models
|
|
Following the [Forgery Detectability Metrics](#forgery-detectability-metrics) section above, we need:
|
|
1. An existing pixel-wise forgery detection model
|
|
2. Our own modern SOTA detection model for each attacking method |