Source code for the paper MinBackProp – Backpropagating through Minimal Solvers
We evaluate MinBackProp on outlier detection for essential matrix estimation. This code builds on the ∇-RANSAC baseline we compare against: the forward pass is identical for the baseline and MinBackProp, while the backward passes differ.
Install the required packages
python = 3.8.10
pytorch = 1.12.1
opencv = 3.4.2
tqdm
kornia
kornia_moons
tensorboardX
scikit-learn
einops
yacs
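One way to set this environment up is sketched below; the exact pinned versions (and the CUDA build of PyTorch, if you train on GPU) may need adjusting for your machine.

```shell
# Illustrative environment setup; versions follow the list above
conda create -n minbackprop python=3.8.10
conda activate minbackprop
pip install torch==1.12.1 opencv-python==3.4.2.17 tqdm kornia kornia_moons \
    tensorboardX scikit-learn einops yacs
```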
For inference, build MAGSAC++ with
git clone https://github.com/disungatullina/magsac.git --recursive
cd magsac
mkdir build
cd build
cmake ..
make
cd ..
python setup.py install
Then clone the project with submodules
git clone https://github.com/disungatullina/MinBackProp.git --recurse-submodules -j8
cd MinBackProp
Use -ift 1 for the IFT, -ift 2 for the DDN, and -ift 0 for Autograd (the baseline); the default is -ift 1.
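To illustrate the difference between these backward passes, here is a minimal toy sketch (not the repository's solver) of backpropagating through a black-box solver with the implicit function theorem: the forward pass runs an iterative solver for g(x, theta) = x^2 - theta = 0, and the backward pass uses the IFT closed form instead of unrolling the iterations, which is what Autograd (-ift 0) would do.

```python
import torch

class SqrtViaIFT(torch.autograd.Function):
    """Toy example: forward runs a non-differentiable 'minimal solver',
    backward applies the implicit function theorem (IFT)."""

    @staticmethod
    def forward(ctx, theta):
        # "Minimal solver": Newton iterations for g(x, theta) = x^2 - theta = 0,
        # treated as a black box (no autograd tape through the loop)
        x = theta.detach().clone().clamp(min=1e-6)
        for _ in range(30):
            x = x - (x * x - theta.detach()) / (2 * x)
        ctx.save_for_backward(x)
        return x

    @staticmethod
    def backward(ctx, grad_out):
        (x,) = ctx.saved_tensors
        # IFT: dx/dtheta = -(dg/dx)^{-1} dg/dtheta = -(2x)^{-1} * (-1) = 1/(2x)
        return grad_out / (2 * x)

theta = torch.tensor(4.0, requires_grad=True)
x = SqrtViaIFT.apply(theta)   # x ~ sqrt(theta) = 2.0
x.backward()                   # theta.grad ~ 1/(2*sqrt(theta)) = 0.25
```

The gradient is exact at the solution regardless of how many solver iterations ran, whereas unrolling would differentiate every Newton step.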
python train.py -ift 1 -nf 2000 -m pretrained_models/weights_init_net_3_sampler_0_epoch_1000_E_rs_r0.80_t0.00_w1_1.00_.net -bs 32 -e 10 -tr 1 -t 0.75 -pth <data_path>
Models for inference are stored in the models directory.
python test_magsac.py -nf 2000 -m models/ift.net -bs 32 -bm 1 -t 2 -pth <data_path>
Download the RootSIFT features of the PhotoTourism dataset from here.
-ift: backprop method to use, 0-autograd, 1-ift, 2-ddn, default=1
-pth: path to the dataset
-nf: number of features, default=2000
-m: pretrained model to init or trained model for inference
-bs: batch size, default=32
-e: the number of epochs, default=10
-tr: train or test mode, default=0
-t: threshold, default=0.75
-lr: learning rate, default=1e-4
-bm: batch mode, using all the 12 testing scenes defined in utils.py, default=0
-ds: name of a scene, if single scene used, default="st_peters_square"
See utils.py for more command line arguments.
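The flags documented above correspond roughly to an argparse setup like the following sketch; the authoritative definitions live in utils.py and may differ in detail.

```python
import argparse

# Hedged sketch mirroring the documented flags (see utils.py for the real ones)
parser = argparse.ArgumentParser()
parser.add_argument("-ift", type=int, default=1, choices=[0, 1, 2],
                    help="backprop method: 0 autograd, 1 IFT, 2 DDN")
parser.add_argument("-pth", type=str, help="path to the dataset")
parser.add_argument("-nf", type=int, default=2000, help="number of features")
parser.add_argument("-m", type=str, help="model to initialize or evaluate")
parser.add_argument("-bs", type=int, default=32, help="batch size")
parser.add_argument("-e", type=int, default=10, help="number of epochs")
parser.add_argument("-tr", type=int, default=0, help="1 = train, 0 = test")
parser.add_argument("-t", type=float, default=0.75, help="inlier threshold")
parser.add_argument("-lr", type=float, default=1e-4, help="learning rate")
parser.add_argument("-bm", type=int, default=0,
                    help="1 = batch mode over all 12 testing scenes")
parser.add_argument("-ds", type=str, default="st_peters_square",
                    help="scene name, if a single scene is used")

args = parser.parse_args(["-ift", "2", "-tr", "1"])
```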
The current Docker image is built for CPU usage. If you need a GPU version, please modify the Dockerfile accordingly.
docker pull dsungatullina/minbackprop
docker run -it dsungatullina/minbackprop
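If you rebuild the image for GPU use, the container also needs GPU access at run time; a hedged example, assuming the NVIDIA Container Toolkit is installed and using an illustrative local tag:

```shell
# Tag name is illustrative; the Dockerfile must first be switched to a CUDA base
docker build -t minbackprop-gpu .
docker run --gpus all -it minbackprop-gpu
```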
For training, run the following command
python3 train.py -ift 1 -nf 2000 -m pretrained_models/weights_init_net_3_sampler_0_epoch_1000_E_rs_r0.80_t0.00_w1_1.00_.net -bs 32 -e 10 -tr 1 -t 0.75 -pth data
To start the inference, run
python3 test_magsac.py -nf 2000 -m models/ift.net -bs 32 -bm 1 -t 2 -pth data
To run the rotation matrix toy example
cd toy_examples
python estimate_rotation.py --ift --ddn --autograd --plot
To run the fundamental matrix toy example
cd toy_examples
python estimate_fundamental.py --ift --ddn --autograd --plot
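For intuition, the rotation toy problem is an orthogonal Procrustes fit: given point correspondences, the minimal solver recovers the rotation in closed form from an SVD. A self-contained NumPy sketch of such a solver (the toy setup here is assumed, not the repository's exact code):

```python
import numpy as np

rng = np.random.default_rng(0)

# Random ground-truth rotation from a QR factorization (det fixed to +1)
Q, _ = np.linalg.qr(rng.standard_normal((3, 3)))
R_true = Q if np.linalg.det(Q) > 0 else -Q

X = rng.standard_normal((3, 20))  # source points
Y = R_true @ X                    # rotated targets (noiseless)

# Minimal solver: R = argmin ||R X - Y||_F over rotations (Kabsch / SVD)
U, _, Vt = np.linalg.svd(Y @ X.T)
d = np.sign(np.linalg.det(U @ Vt))      # reflection guard: keep det(R) = +1
R_est = U @ np.diag([1.0, 1.0, d]) @ Vt
```

With noiseless data the SVD solver recovers R_true exactly; the toy scripts then compare backpropagating through this kind of solver via the IFT, the DDN, and plain autograd.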
If you use our algorithm, please cite
@ARTICLE{Sungatullina2024-A83,
author={Diana Sungatullina and Tomas Pajdla},
title={MinBackProp – Backpropagating through Minimal Solvers},
journal={Journal of WSCG},
year={2024},
volume={32},
number = {1-2},
pages={41-50},
doi={10.24132/JWSCG.2024.5},
publisher={Union Agency, Science Press},
issn={1213-6972},
abbrev_source_title={J WSCG},
document_type={Article},
}