Conference on Neural Information Processing Systems (NeurIPS), 2021
Xi Shen1 Yang Xiao1 Shell Xu Hu2 Othman Sbai1, 3 Mathieu Aubry1
1LIGM (UMR 8049) - Ecole des Ponts, UPE 2Samsung AI Center, Cambridge 3Facebook AI Research
In the problems of image retrieval and few-shot classification, the mainstream approaches focus on learning a better feature representation. However, directly tackling the distance or similarity measure between images can also be effective. To this end, we revisit the idea of re-ranking the top-k retrieved images in the context of image retrieval (e.g., the k-reciprocal nearest neighbors [Qin et al., 2011; Zhong et al., 2017]) and generalize this idea to transductive few-shot learning. We propose to meta-learn the re-ranking updates such that the similarity graph converges towards the target similarity graph induced by the image labels. Specifically, the re-ranking module takes as input an initial similarity graph between the query image and the contextual images, computed with a pre-trained feature extractor, and predicts an improved similarity graph by leveraging the structure among the involved images. We show that our re-ranking approach can be applied to unseen images and can further boost existing approaches for both image retrieval and few-shot learning problems. Our approach operates either independently or in conjunction with classical re-ranking approaches, yielding clear and consistent improvements on image retrieval (CUB, Cars, SOP, rOxford5K, rParis6K) and transductive few-shot classification (Mini-ImageNet, tiered-ImageNet and CIFAR-FS) benchmarks.
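To make the idea of a learned re-ranking update concrete, here is a minimal PyTorch sketch of a module that takes an initial similarity graph and predicts a refined one. It is not the released code of the paper: the row-wise softmax, the second-order (neighborhood) similarities, the small MLP, and the names `ReRanker` and `update` are illustrative assumptions only.

```python
# Minimal sketch (NOT the authors' released implementation): a module that
# maps an initial similarity graph to a refined one using the structure
# among the involved images. Architecture details are assumptions.
import torch
import torch.nn as nn


class ReRanker(nn.Module):
    def __init__(self, hidden_dim: int = 64):
        super().__init__()
        # Hypothetical update network: maps first- and second-order
        # similarities to a residual correction of each graph edge.
        self.update = nn.Sequential(
            nn.Linear(2, hidden_dim),
            nn.ReLU(),
            nn.Linear(hidden_dim, 1),
        )

    def forward(self, sim: torch.Tensor) -> torch.Tensor:
        # sim: (n, n) initial similarity graph between the query and the
        # contextual images, from a pre-trained feature extractor.
        attn = torch.softmax(sim, dim=-1)                  # normalize each row
        second_order = attn @ attn.t()                     # neighborhood overlap
        feats = torch.stack([sim, second_order], dim=-1)   # (n, n, 2)
        refined = sim + self.update(feats).squeeze(-1)     # residual update
        return refined


if __name__ == "__main__":
    n = 16
    feats = nn.functional.normalize(torch.randn(n, 128), dim=-1)
    sim = feats @ feats.t()                                # cosine similarity graph
    refined = ReRanker()(sim)
    print(refined.shape)                                   # torch.Size([16, 16])
```

In the meta-learning setting described above, such a module would be trained so that the refined graph moves towards the target similarity graph induced by the image labels.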
To cite our paper:

@inproceedings{shen2021reranking,
  title={Re-ranking for image retrieval and transductive few-shot classification},
  author={Shen, Xi and Xiao, Yang and Hu, Shell Xu and Sbai, Othman and Aubry, Mathieu},
  booktitle={Conference on Neural Information Processing Systems (NeurIPS)},
  year={2021}
}
This work was supported in part by ANR project EnHerit ANR-17-CE23-0008, project Rapid Tabasco, and IDRIS under the allocation AD011011160R1 made by GENCI.