Image Retrieval: Reproducing the MS Loss Paper

A few words up front:

This is a personal study write-up, for reference only and not for commercial use.

My abilities are limited; corrections are welcome so we can improve together.

Original paper: Multi-Similarity Loss with General Pair Weighting for Deep Metric Learning

The work comes from researchers at the Shenzhen Malong Artificial Intelligence Research Center.
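To see what the loss itself does before touching the code, here is a minimal sketch of the multi-similarity loss written in plain PyTorch from the paper's description. It is not the repository's implementation; the defaults alpha=2, beta=50, lam=1 and the mining margin eps=0.1 follow the settings reported in the paper, and the repo's own config may use different values.

import torch
import torch.nn.functional as F


def ms_loss(embeddings, labels, alpha=2.0, beta=50.0, lam=1.0, eps=0.1):
    # embeddings: (N, D) raw features; labels: (N,) integer class ids
    emb = F.normalize(embeddings, dim=1)   # work in cosine-similarity space
    sim = emb @ emb.t()                    # (N, N) pairwise similarities
    losses = []
    for i in range(sim.size(0)):
        pos_mask = labels == labels[i]
        pos_mask[i] = False                # exclude the anchor itself
        pos = sim[i][pos_mask]
        neg = sim[i][labels != labels[i]]
        if pos.numel() == 0 or neg.numel() == 0:
            continue
        # pair mining: keep negatives harder than the easiest positive and
        # positives harder than the hardest negative (within margin eps)
        kept_neg = neg[neg + eps > pos.min()]
        kept_pos = pos[pos - eps < neg.max()]
        if kept_pos.numel() == 0 or kept_neg.numel() == 0:
            continue
        # soft weighting of the surviving pairs (the MS loss itself)
        pos_term = torch.log1p(torch.exp(-alpha * (kept_pos - lam)).sum()) / alpha
        neg_term = torch.log1p(torch.exp(beta * (kept_neg - lam)).sum()) / beta
        losses.append(pos_term + neg_term)
    return torch.stack(losses).mean() if losses else sim.sum() * 0.0

In training this is paired with a batch sampler that draws several images per class, so that every anchor has in-batch positives to mine.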

GitHub repository: https://github.com/msight-tech/research-ms-loss

Step 1: Clone and install

git clone https://github.com/msight-tech/research-ms-loss
cd research-ms-loss
pip install -r requirements.txt
python setup.py develop build
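As an optional sanity check after installation, the snippet below verifies that PyTorch sees the GPU; the ret_benchmark import is my assumption about the package name that setup.py installs in this repo, so adjust it if your checkout differs.

import torch

print("torch:", torch.__version__, "| CUDA available:", torch.cuda.is_available())

try:
    import ret_benchmark  # assumed package name, based on the repo layout
    print("repo package found at:", ret_benchmark.__file__)
except ImportError as err:
    print("repo package not importable yet:", err)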

Step 2: Dataset download

The dataset link in the repo has expired, so the download will fail if you use it directly.

Here is a working download link: download data

Next, create a directory for the dataset:

mkdir -p resource/datasets

Put the downloaded CUB_200_2011.tgz into this directory and extract it there:

tar -zxf CUB_200_2011.tgz

Then run the split script from the repository root:

python scripts/split_cub_for_ms_loss.py
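For orientation, this is roughly what the standard CUB split for deep metric learning looks like: classes 1-100 go to training and classes 101-200 to testing, read straight from the metadata files shipped inside CUB_200_2011. The repo's split script is the authoritative version; the dataset path and the output file names below are only illustrative.

from pathlib import Path

# adjust if you extracted the archive somewhere else
root = Path("resource/datasets/CUB_200_2011")

# images.txt:             "<image_id> <relative_path>"
# image_class_labels.txt: "<image_id> <class_id>"
images = dict(line.split() for line in (root / "images.txt").read_text().splitlines())
classes = dict(line.split() for line in (root / "image_class_labels.txt").read_text().splitlines())

train, test = [], []
for img_id, rel_path in images.items():
    cls = int(classes[img_id])
    (train if cls <= 100 else test).append(f"{rel_path} {cls}")

# illustrative output names, not necessarily what the repo's script writes
(root / "train_sketch.txt").write_text("\n".join(train) + "\n")
(root / "test_sketch.txt").write_text("\n".join(test) + "\n")
print(len(train), "training images,", len(test), "test images")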

Step 3: Model download

Download the pretrained BN-Inception backbone weights as required: bn_inception-52deb4733.pth.

Note that you need to edit the configuration file: the pretrained-model path in it must match wherever you actually placed the file.
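An optional check before launching training: make sure the pretrained weights actually load from the path your config points at. The path below is an assumption; replace it with wherever you put bn_inception-52deb4733.pth.

import torch

ckpt = "resource/bn_inception-52deb4733.pth"  # assumed location, match your config
state = torch.load(ckpt, map_location="cpu")
print("loaded", len(state), "tensors; first key:", next(iter(state)))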

Then, from the repository directory, run:

./scripts/run_cub.sh

At this point I hit an error telling me that a referenced path was incorrect.

I read the startup script; its core is tool/main.py. I copied main.py, and its path is shown in the figure.

The result of the run is shown in the figure.

Step 4: Training complete

Training takes about 24 minutes. The best recall reached is 0.65, and the trained model is saved under the output directory.
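The recall here is presumably the standard Recall@1 retrieval metric reported for CUB. As a sketch of what it measures (the repo has its own evaluation code), for each test image you retrieve its nearest neighbour in embedding space, excluding itself, and count a hit when the neighbour belongs to the same class:

import torch
import torch.nn.functional as F


def recall_at_1(embeddings, labels):
    # embeddings: (N, D) test-set features; labels: (N,) class ids
    emb = F.normalize(embeddings, dim=1)
    sim = emb @ emb.t()
    sim.fill_diagonal_(float("-inf"))   # a query must not retrieve itself
    nearest = sim.argmax(dim=1)         # top-1 neighbour for each query
    return (labels[nearest] == labels).float().mean().item()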

Dataset introduction:

The dataset (CUB-200-2011) contains 200 bird species and 11,788 images in total, i.e. roughly 60 images per species.
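If you want to check the class balance yourself, the counts can be read from the metadata file shipped with the dataset (the path assumes the extraction location from Step 2):

from collections import Counter
from pathlib import Path

labels_path = Path("resource/datasets/CUB_200_2011/image_class_labels.txt")
counts = Counter(int(line.split()[1]) for line in labels_path.read_text().splitlines())
print("classes:", len(counts),
      "| images per class: min", min(counts.values()), "max", max(counts.values()))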

About the training process:

I trained on an RTX 3090; the configuration is as follows:
