This repo contains the source code for the paper A Baseline Method for Removing Invisible Image Watermarks using Deep Image Prior.
If you find our work interesting or helpful, please consider citing our paper:
@article{liang2025baseline,
title={A Baseline Method for Removing Invisible Image Watermarks using Deep Image Prior},
author={Liang, Hengyue and Li, Taihui and Sun, Ju},
journal={arXiv preprint arXiv:2502.13998},
year={2025}
}
We recommend using a conda environment with the following installed:
python==3.12
conda install pytorch==2.2.2 torchvision==0.17.2 pytorch-cuda=12.1 -c pytorch -c nvidia
Then, install the dependencies by:
pip install -r requirement.txt
Finally, run the following command to install the modified diffusers, which implements the regeneration attack proposed in Invisible Image Watermarks Are Provably Removable Using Generative AI.
pip install -e .
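As an optional sanity check, you can confirm which diffusers installation Python resolves; after an editable install, the reported path should point inside this repo (the exact path is environment-dependent, shown here only as an illustration):

```python
# Optional sanity check: locate the diffusers package Python will import.
# After `pip install -e .`, spec.origin should point inside this repo.
import importlib.util

spec = importlib.util.find_spec("diffusers")
if spec is None:
    print("diffusers is not installed")
else:
    print("diffusers resolves to:", spec.origin)
```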
To configure the DiffPure method from Robustness of AI-Image Detectors: Fundamental Limits and Practical Attacks, first clone the official repo to PATH_THAT_YOU_LIKE:
cd PATH_THAT_YOU_LIKE
git clone https://github.com/mehrdadsaberi/watermark_robustness.git
Then run the corresponding bash file to download the official pretrained model:
cd watermark_robustness
bash _bash_download_models.sh
Then copy the entire DiffPure folder into this repo:
cp -r DiffPure DIP_Watermark_Evasion/
If you have trouble installing compressai, you may trace the code and comment out the parts that use compressai; this only affects the VAE regeneration benchmarks, not the other methods.
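Instead of deleting code, the workaround above can be done less invasively with an import guard. This is only a sketch of the pattern; the function name `run_vae_regeneration` is hypothetical and not an actual function in this repo:

```python
# Hedged sketch: guard the optional compressai dependency so that only the
# VAE regeneration benchmarks are disabled when compressai is missing.
try:
    import compressai  # only needed by the VAE regeneration benchmarks
    HAS_COMPRESSAI = True
except ImportError:
    compressai = None
    HAS_COMPRESSAI = False

def run_vae_regeneration(image):
    # Hypothetical entry point, used here only to illustrate the guard.
    if not HAS_COMPRESSAI:
        raise RuntimeError(
            "compressai is not installed; skipping the VAE regeneration benchmark"
        )
    # ... the actual benchmark code would go here ...
```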
Most baseline methods are largely adapted from the paper Invisible Image Watermarks Are Provably Removable Using Generative AI and its public code.
The baseline "WeVadeBQ" is adapted from the paper Evading Watermark based Detection of AI-Generated Content and its public code.
For convenience, this project only provides instructions for packaging the DiffPure method from Robustness of AI-Image Detectors: Fundamental Limits and Practical Attacks. Check the official repo (full version) for the other methods they propose.
Special thanks to the original authors and their hard work!