DeFRCN: Decoupled Faster R-CNN for Few-Shot Object Detection
Authors: Limeng Qiao, Yuxuan Zhao, Zhiyuan Li, Xi Qiu, Jianan Wu, Chi Zhang
Summary: Few-shot object detection, which aims at rapidly detecting novel objects from extremely few annotated examples of previously unseen classes, has attracted significant research interest in the community. Most existing approaches employ Faster R-CNN as the basic detection framework, yet, due to the lack of tailored considerations for the data-scarce scenario, their performance is often unsatisfactory. In this paper, we look closely into the conventional Faster R-CNN and analyze its contradictions from two orthogonal perspectives, namely multi-stage (RPN vs. RCNN) and multi-task (classification vs. localization). To resolve these issues, we propose a simple yet effective architecture, named Decoupled Faster R-CNN (DeFRCN). To be concrete, we extend Faster R-CNN by introducing a Gradient Decoupled Layer for multi-stage decoupling and a Prototypical Calibration Block for multi-task decoupling. The former is a novel deep layer that redefines the feature-forward and gradient-backward operations to decouple its subsequent and preceding layers, and the latter is an offline prototype-based classification model that takes proposals from the detector as input and boosts the original classification scores with additional pairwise scores for calibration. Extensive experiments on multiple benchmarks show that our framework is remarkably superior to other existing approaches and establishes a new state of the art in the few-shot literature.
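The core mechanism of the Gradient Decoupled Layer described above can be illustrated with a minimal sketch: the forward pass leaves the feature (essentially) unchanged, while the backward pass scales the gradient by a constant λ, so the backbone receives a weakened training signal from the head without affecting the head's own updates. This is an assumption-based illustration of the idea, not the authors' implementation; the function names (`gdl_forward`, `gdl_backward`, `grads_with_gdl`) and the scalar toy model are hypothetical, and the paper's actual layer also includes an affine transform, which is omitted here for brevity.

```python
# Toy scalar model: loss = (gdl(x) * w) ** 2, where x plays the role of a
# backbone feature and w plays the role of a head (e.g. RPN/RCNN) weight.

def gdl_forward(x):
    # Identity in the forward direction (the real layer adds an affine
    # transform; omitted in this sketch).
    return x

def gdl_backward(grad_output, lam):
    # Scale the incoming gradient by lam. lam = 0 fully stops gradients
    # (complete decoupling); lam = 1 recovers standard backpropagation.
    return lam * grad_output

def grads_with_gdl(x, w, lam):
    # Manual chain rule through loss = (gdl(x) * w) ** 2.
    y = gdl_forward(x) * w
    dloss_dy = 2.0 * y
    dloss_dw = dloss_dy * gdl_forward(x)        # gradient to the head weight
    dloss_dx = gdl_backward(dloss_dy * w, lam)  # gradient to the backbone
    return dloss_dx, dloss_dw

if __name__ == "__main__":
    # With lam = 1.0, both gradients are the standard ones; with a small
    # lam, the head gradient is unchanged while the backbone gradient
    # shrinks proportionally.
    print(grads_with_gdl(3.0, 2.0, lam=1.0))
    print(grads_with_gdl(3.0, 2.0, lam=0.1))
```

Note how only the gradient flowing toward the backbone is rescaled: this is what lets the RPN and RCNN heads train at full strength while their conflicting signals into the shared backbone are attenuated independently.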