[CVPR2021] Meta Batch-Instance Normalization for Generalizable Person Re-Identification
Feel free to visit my homepage and my awesome person re-ID GitHub page
<Illustration of unsuccessful generalization scenarios and our framework>
git clone our_repository
conda create -n MetaBIN python=3.6
conda activate MetaBIN
conda install pytorch==1.7.0 torchvision==0.8.0 torchaudio==0.7.0 cudatoolkit=10.1 -c pytorch
pip install tensorboard
pip install Cython
pip install yacs
pip install termcolor
pip install tabulate
pip install scikit-learn
pip install h5py
pip install imageio
pip install openpyxl
pip install matplotlib
pip install pandas
pip install seaborn
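If you prefer, the pip dependencies above can be installed in a single command, and you can then quickly sanity-check the PyTorch/CUDA setup:

# Install all pip dependencies listed above in one call
pip install tensorboard Cython yacs termcolor tabulate scikit-learn h5py imageio openpyxl matplotlib pandas seaborn
# Quick check that PyTorch imports and sees the GPU
python -c "import torch; print(torch.__version__, torch.cuda.is_available())"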
Download our model [link] to MetaBIN/logs/Sample/DG-mobilenet
├── MetaBIN/logs/Sample/DG-mobilenet
│ ├── last_checkpoint
│ ├── model_0099999.pth
│ ├── result.png
Download test datasets [link] to MetaBIN/datasets/
├── MetaBIN/datasets
│ ├── GRID
│ ├── prid_2011
│ ├── QMUL-iLIDS
│ ├── viper
Execute the run file:
cd MetaBIN/
sh run_evaluate.sh
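If you want to run the evaluation by hand instead of through run_evaluate.sh, the command below is a rough sketch of an equivalent call. It assumes fastreid's MODEL.WEIGHTS config key and the checkpoint path shown above; adjust both to your setup.

# Sketch of a manual evaluation run (paths and config key are assumptions, not taken from run_evaluate.sh)
python3 ./tools/train_net.py --config-file ./configs/Sample/DG-mobilenet.yml --eval-only \
  MODEL.WEIGHTS ./logs/Sample/DG-mobilenet/model_0099999.pth MODEL.DEVICE "cuda:0"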
You can get the following results:
Datasets | Rank-1 | Rank-5 | Rank-10 | mAP | mINP | TPR@FPR=0.0001 | TPR@FPR=0.001 | TPR@FPR=0.01 |
---|---|---|---|---|---|---|---|---|
ALL_GRID_average | 49.68% | 67.52% | 76.80% | 58.10% | 58.10% | 0.00% | 0.00% | 46.35% |
ALL_GRID_std | 2.30% | 3.56% | 3.14% | 2.58% | 2.58% | 0.00% | 0.00% | 26.49% |
ALL_VIPER_only_10_average | 56.90% | 76.71% | 82.03% | 65.98% | 65.98% | 0.00% | 0.00% | 50.97% |
ALL_VIPER_only_10_std | 2.97% | 2.11% | 2.06% | 2.35% | 2.35% | 0.00% | 0.00% | 8.45% |
ALL_PRID_average | 72.50% | 88.20% | 91.30% | 79.78% | 79.78% | 0.00% | 0.00% | 91.00% |
ALL_PRID_std | 2.20% | 2.60% | 2.00% | 1.88% | 1.88% | 0.00% | 0.00% | 1.47% |
ALL_iLIDS_average | 79.67% | 93.33% | 97.33% | 85.51% | 85.51% | 0.00% | 0.00% | 56.13% |
ALL_iLIDS_std | 4.40% | 2.47% | 2.26% | 2.80% | 2.80% | 0.00% | 0.00% | 15.77% |
all_average | 64.69% | 81.44% | 86.86% | 72.34% | 72.34% | 0.00% | 0.00% | 61.11% |
MetaBIN/
├── configs/
├── datasets/ (*download the datasets and connect them by symbolic links [check section 4]; please check the folder names*)
│ ├── *cuhk02
│ ├── *cuhk03
│ ├── *CUHK-SYSU
│ ├── *DukeMTMC-reID
│ ├── *GRID
│ ├── *Market-1501-v15.09.15
│ ├── *prid_2011
│ ├── *QMUL-iLIDS
│ ├── *viper
├── demo/
├── fastreid/
├── logs/
├── pretrained/
├── tests/
├── tools/
'*' marks the symbolic links that you create (see the sections below)
Download dataset
Symbolic link (recommended)
Run symbolic_link_dataset.sh:
cd MetaBIN
bash symbolic_link_dataset.sh
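For reference, links of this kind can also be made by hand. The snippet below is only a sketch: DATA_ROOT is a placeholder for wherever you keep the downloaded datasets, and the folder names follow the tree in the code-structure section above.

# Sketch only: manual equivalents of the dataset symbolic links (DATA_ROOT is an assumed placeholder path)
DATA_ROOT=/path/to/your/datasets
mkdir -p ./datasets
for d in cuhk02 cuhk03 CUHK-SYSU DukeMTMC-reID GRID Market-1501-v15.09.15 prid_2011 QMUL-iLIDS viper; do
  ln -s "$DATA_ROOT/$d" "./datasets/$d"
done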
Direct connect (not recommended)
Place the downloaded dataset folders directly under ./datasets/.
Symbolic link (recommended)
Arrange the folders as below (MetaBIN(logs) and MetaBIN(pretrained) next to the MetaBIN repository), then run symbolic_link_others.sh:
├── MetaBIN
│ ├── configs/
│ ├── ....
│ ├── tools/
├── MetaBIN(logs)
├── MetaBIN(pretrained)
cd MetaBIN
bash symbolic_link_others.sh
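The links created here are expected to look roughly like the following. This is only a sketch: it assumes MetaBIN(logs) and MetaBIN(pretrained) sit next to the MetaBIN repository as in the tree above, and it is run from inside MetaBIN.

# Sketch only: manual equivalents of symbolic_link_others.sh (relative paths are assumptions)
ln -s "../MetaBIN(logs)" ./logs
ln -s "../MetaBIN(pretrained)" ./pretrained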
Direct connect (not recommended)
Create the logs and pretrained folders directly inside MetaBIN.
If you run the code in PyCharm, set the working directory to your folders/MetaBIN/ and add the following run parameter:
--config-file ./configs/Sample/DG-mobilenet.yml
Single GPU
python3 ./tools/train_net.py --config-file ./configs/Sample/DG-mobilenet.yml
python3 ./tools/train_net.py --config-file ./configs/Sample/DG-mobilenet.yml MODEL.DEVICE "cuda:0"
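You can also pin the process to a particular GPU through the environment instead of the MODEL.DEVICE option; this is standard CUDA behavior and not specific to this repository.

# Make only GPU 1 visible to the process, then train as usual
CUDA_VISIBLE_DEVICES=1 python3 ./tools/train_net.py --config-file ./configs/Sample/DG-mobilenet.yml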
Resume training (this reads the last_checkpoint file in the log folder):
python3 ./tools/train_net.py --config-file ./configs/Sample/DG-mobilenet.yml --resume
Evaluation only:
python3 ./tools/train_net.py --config-file ./configs/Sample/DG-mobilenet.yml --eval-only
(1) Market1501
Create a folder named Market-1501-v15.09.15. Download the dataset from the link and extract the files. The folder structure should look like:
Market-1501-v15.09.15/
├── bounding_box_test/
├── bounding_box_train/
├── gt_bbox/
├── gt_query/
├── query/
(2) DukeMTMC-reID
Create a folder named DukeMTMC-reID. Download the dataset from the link and extract the files. The folder structure should look like:
DukeMTMC-reID/
├── bounding_box_test/
├── bounding_box_train/
├── query/
(3) CUHK02
Create a folder named cuhk02. Download the dataset from the link and extract the files into the cuhk02 folder. The folder structure should look like:
cuhk02/
├── P1/
├── P2/
├── P3/
├── P4/
├── P5/
(4) CUHK03
Create a folder named cuhk03. Download the dataset from the link and extract "cuhk03_release.zip", resulting in "cuhk03/cuhk03_release/". Place the new-protocol split files shown below under cuhk03/. The folder structure should look like:
cuhk03/
├── cuhk03_release/
├── cuhk03_new_protocol_config_detected.mat
├── cuhk03_new_protocol_config_labeled.mat
(5) Person Search (CUHK-SYSU)
Create a folder named CUHK-SYSU. Download the dataset from the link and extract the files. Generate the cropped images with make_cropped_image.m (this code is included in the datasets folder). The folder structure should look like:
CUHK-SYSU/
├── annotation/
├── Image/
├── cropped_image/
├── make_cropped_image.m (my matlab code)
(6) GRID
Create a folder named GRID. Download the dataset from the link and extract the files. The split file (see below) is created automatically by the python code grid.py. The folder structure should look like:
GRID/
├── gallery/
├── probe/
├── splits_single_shot.json (This will be created by `grid.py` in `fastreid/data/datasets/` folder)
(7) PRID
Create a folder named prid_2011. Download the dataset from the link and extract the files. The split file splits_single_shot.json is created automatically by the python code prid.py. The folder structure should look like:
prid_2011/
├── single_shot/
├── multi_shot/
├── splits_single_shot.json (This will be created by `prid.py` in `fastreid/data/datasets/` folder)
(8) QMUL i-LIDS
Create a folder named QMUL-iLIDS. Download the dataset from the link above and extract the files. The split file is created automatically by the python code iLIDS.py. The folder structure should look like:
QMUL-iLIDS/
├── images/
├── splits.json (This will be created by `iLIDS.py` in `fastreid/data/datasets/` folder)
(9) VIPeR
Create a folder named viper. Download the dataset from the link and extract the files. Create the split files with make_split.m (this code is included in the datasets folder). The folder structure should look like:
viper/
├── cam_a/
├── cam_b/
├── make_split.m (my matlab code)
├── split_1a # Train: split1, Test: split2 ([query]cam1->[gallery]cam2)
├── split_1b # Train: split2, Test: split1 (cam1->cam2)
├── split_1c # Train: split1, Test: split2 (cam2->cam1)
├── split_1d # Train: split2, Test: split1 (cam2->cam1)
...
...
├── split_10a
├── split_10b
├── split_10c
├── split_10d
Our code is based on fastreid [link]
fastreid/config/defaults.py: default settings (parameters)
fastreid/data/datasets/: dataset definitions
tools/train_net.py: Main code (train/test/tsne/visualize)
MetaBIN/configs/Sample/DG-mobilenet.yml: the sample config used in this README
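Individual config values can be overridden at the end of the command line as key-value pairs, as with MODEL.DEVICE in the training example above; the keys live in fastreid/config/defaults.py. OUTPUT_DIR below is used only as an illustration.

# Override a config key from the command line (key/value pairs after the named arguments)
python3 ./tools/train_net.py --config-file ./configs/Sample/DG-mobilenet.yml OUTPUT_DIR ./logs/Sample/DG-mobilenet-test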
To speed up evaluation, compile the Cython ranking code:
cd fastreid/evaluation/rank_cylib; make all
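A quick way to confirm the extension was built (the exact shared-object filename depends on your Python version):

# The compiled Cython ranking extension should appear as a .so file
ls fastreid/evaluation/rank_cylib/*.so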
logs (section 3)
pretrained (section 6)
datasets (section 8)
@InProceedings{choi2021metabin,
title = {Meta Batch-Instance Normalization for Generalizable Person Re-Identification},
author = {Choi, Seokeon and Kim, Taekyung and Jeong, Minki and Park, Hyoungseob and Kim, Changick},
booktitle = {Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)},
month = {June},
year = {2021}
}