
HyperForensics++ Dataset Contribution Guide

This guide outlines the steps to contribute to the HyperForensics++ dataset repository.


1. Clone the Repository

First, clone the repository from Hugging Face to your local machine:

# Make sure git-lfs is installed (https://git-lfs.com)
git lfs install
git clone https://huggingface.co/datasets/OtoroLin/HyperForensics-plus-plus
cd ./HyperForensics-plus-plus

2. Extract the Dataset

Decompress the .tar.gz files into a separate local hierarchy. You can use the provided unzipping.sh script. First, set the ROOT_DIR variable to the local_data path:

# In unzipping.sh
# Define the root directory of the whole dataset
ROOT_DIR="/path/to/local_data"

Due to the redundancy of our dataset, we use the pigz program instead of the standard gzip program to compress the files. Make sure it is installed first:

sudo apt install pigz
# If you are using miniconda
conda install pigz
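The decompression performed by unzipping.sh can be sketched with tar's --use-compress-program option. The toy example below (the demo_* paths are hypothetical, and the real script's internals may differ) round-trips a small file, falling back to plain gzip when pigz is not available:

```shell
# Pick pigz if available, otherwise fall back to the standard gzip.
COMPRESSOR=$(command -v pigz || echo gzip)

# Build a tiny archive and extract it again, mimicking what the
# script presumably does for each config<N>.tar.gz.
mkdir -p demo_src demo_out
echo "sample" > demo_src/images.npy
tar -cf demo.tar.gz --use-compress-program="$COMPRESSOR" -C demo_src .
tar -xf demo.tar.gz --use-compress-program="$COMPRESSOR" -C demo_out
cat demo_out/images.npy   # → sample
```

On extraction, tar invokes the compressor with -d, which both pigz and gzip understand, so the same variable works in both directions.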

2.1 Decompressing a specific method and configuration

./unzipping.sh --method <method name> --config <config name>

Replace <method name> with the forgery method and <config name> with the configuration index. This will automatically decompress the .tar.gz file and place its contents under the hierarchy of the local dataset directory. For instance, to decompress the images under the ADMM-ADAM forgery method, config0 configuration:

./unzipping.sh --method ADMM-ADAM --config 0

This will decompress HyperForensics-plus-plus/data/ADMM-ADAM/config0/config0.tar.gz to /path/to/local/ADMM-ADAM/config0/*.
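The mapping between the archive in the repo and the local destination presumably looks like the following (ROOT_DIR and the variable names are hypothetical, mirroring the example above):

```shell
ROOT_DIR="/path/to/local"
METHOD="ADMM-ADAM"
CONFIG="0"

# Archive inside the cloned repo ...
SRC="HyperForensics-plus-plus/data/${METHOD}/config${CONFIG}/config${CONFIG}.tar.gz"
# ... is extracted into the matching place in the local hierarchy.
DST="${ROOT_DIR}/${METHOD}/config${CONFIG}"

echo "$SRC -> $DST"
```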

2.2 Decompressing the entire hierarchy

./unzipping.sh --all

This will recursively decompress every .tar.gz file in HyperForensics-plus-plus/data and automatically place the decompressed files under the hierarchy of the local dataset directory.
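The --all mode presumably amounts to a loop like the one below. This is a sketch over a toy hierarchy (the MethodA paths and local_out root are made up for illustration), not the script's actual implementation:

```shell
# Build a toy repo hierarchy with two compressed configs.
mkdir -p data/MethodA/config0 data/MethodA/config1 local_out
echo a > a.txt && tar -czf data/MethodA/config0/config0.tar.gz a.txt
echo b > b.txt && tar -czf data/MethodA/config1/config1.tar.gz b.txt

# Walk the hierarchy and extract each archive into the mirrored
# location under the local output root.
find data -name '*.tar.gz' | while read -r archive; do
  rel=$(dirname "${archive#data/}")        # e.g. MethodA/config0
  mkdir -p "local_out/$rel"
  tar -xzf "$archive" -C "local_out/$rel"
done
```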

A Note on Pull Requests on Hugging Face

Pull requests on Hugging Face do not use forks and branches. Instead, they use custom “branches” called refs (for example, refs/pr/42) that are stored directly on the source repository.

The advantage of custom refs over regular branches is that they are not fetched by default by people cloning the repo, but they can still be fetched on demand.

To obtain a PR ref, first create a new pull request on the repository page on Hugging Face; the push command in step 5 then targets that ref.

3. Modify or Generate Data

Make changes or generate new data in the extracted directory. Ensure the data follows the required structure:

local_data/
    |-{attacking method}/
    |   |-config0/
    |   |   |-images.npy
    |   |-config1/
    |   | ...
    |   |-config4/
    |-{attacking method}/
    |   |-config{0-4}/
    |-Origin
    |   |-images.npy
HyperForensics-plus-plus/
    |-data/
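A quick sanity check of this layout can be scripted. The snippet below builds a minimal instance of the expected tree (the method name ExampleMethod and all paths are illustrative) and then verifies that each directory contains an images.npy:

```shell
# Create a minimal instance of the required layout.
for c in 0 1 2 3 4; do
  mkdir -p "local_data/ExampleMethod/config$c"
  touch "local_data/ExampleMethod/config$c/images.npy"
done
mkdir -p local_data/Origin
touch local_data/Origin/images.npy

# Verify every expected directory holds an images.npy.
for f in local_data/ExampleMethod/config*/images.npy local_data/Origin/images.npy; do
  [ -f "$f" ] && echo "ok: $f"
done
```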

4. Zip the Directory

After modifying or generating data, compress the directories into .tar.gz files and place them in the repository. You can use the provided zipping.sh script. First, set the ROOT_DIR variable to the local_data path:

# In zipping.sh
# Define the root directory of the whole dataset
ROOT_DIR="/path/to/local_data"

As in step 2, pigz is used instead of the standard gzip to compress the files; make sure it is installed (see the installation commands in step 2).

4.1 Compressing a specific method and configuration

./zipping.sh --method <method name> --config <config name>

Replace <method name> with the forgery method and <config name> with the configuration index. This will automatically place the compressed .tar.gz file under the hierarchy of the dataset repo. For instance, to compress the images under the ADMM-ADAM forgery method, config0 configuration:

./zipping.sh --method ADMM-ADAM --config 0

This will compress /path/to/local/ADMM-ADAM/config0/* to HyperForensics-plus-plus/data/ADMM-ADAM/config0/config0.tar.gz.

4.2 Compressing a specific directory

./zipping.sh --dir-path <path/to/directory>

Replace <path/to/directory> with the intended directory path. This will compress the directory and place the archive in the working directory. For instance:

./zipping.sh --dir-path /path/to/local/ADMM-ADAM/config0

This will compress /path/to/local/ADMM-ADAM/config0/* to HyperForensics-plus-plus/config0.tar.gz.
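The archive name is presumably derived from the directory's basename and written to the current working directory, roughly like this (the DIR value is illustrative):

```shell
# Derive the archive name from the target directory's basename.
DIR="/path/to/local/ADMM-ADAM/config0"
ARCHIVE="$(basename "$DIR").tar.gz"
echo "$ARCHIVE"   # → config0.tar.gz
```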

4.3 Compressing the entire hierarchy

./zipping.sh --all

This will compress each directory in the local data hierarchy separately and automatically place the compressed files under the hierarchy of the dataset repo.

Note that I did not implement method/config-specific compression for Origin, since it seems unlikely that people need to compress or decompress it regularly. You can either do it manually or use the path-specific compression mode and place the archive into the hierarchy by hand.

5. Create a Pull Request

Push your changes to the pull request. Hugging Face implements pull requests via a plain git push to the PR ref:

git push origin pr/<pr index>:refs/pr/<pr index>

Replace <pr index> with the pull request index.
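The mechanics of pushing to a custom ref can be tried locally with a throwaway bare repository. Everything below is illustrative; on Hugging Face, the remote is the dataset repo and the PR index is assigned when the pull request is created:

```shell
# Set up a throwaway "remote" and a working clone.
git init -q --bare remote.git
git init -q work
cd work
git config user.email "you@example.com"
git config user.name "You"
echo "change" > file.txt
git add file.txt
git commit -qm "demo commit"

# Push the local branch to a custom PR ref on the remote,
# analogous to refs/pr/<pr index> on Hugging Face.
git push -q ../remote.git HEAD:refs/pr/1

# The ref now exists on the remote, even though a normal
# clone would not fetch it by default.
git ls-remote ../remote.git refs/pr/1
cd ..
```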

6. Wait for the Pull Request to Be Merged

Once the pull request is created, the repository maintainer will review and merge it.


Thank you for contributing to the HyperForensics++ dataset!