FEW-SHOT IMAGE CLASSIFICATION VIA MUTUAL DISTILLATION


Owing to their compelling performance and appealing simplicity, metric-based meta-learning approaches are gaining increasing attention for few-shot image classification. However, many such methods employ intricate network architectures, which can lead to overfitting when trained with limited samples. To address this concern, we propose using mutual distillation to enhance metric-based meta-learning and improve model generalization. Specifically, our approach involves two individual metric-based networks, such as prototypical networks and relation networks, which supply each other with a regularization term.

This method integrates seamlessly with any metric-based meta-learning approach. We conduct comprehensive experiments on two widely used few-shot classification benchmarks, miniImageNet and Caltech-UCSD Birds-200-2011 (CUB), to demonstrate the effectiveness of the proposed algorithm. The results show that our method consistently improves each metric-based model through mutual distillation.
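To make the mutual-distillation idea concrete, the sketch below shows the general form of such a loss: each of two networks is trained with its own cross-entropy objective plus a KL-divergence term that pulls its (temperature-softened) class distribution toward that of its peer. This is a minimal NumPy illustration of the generic technique, not the paper's exact formulation; the function names, the weighting coefficient `alpha`, and the temperature value are all hypothetical choices for exposition.

```python
import numpy as np

def softmax(logits, temperature=1.0):
    """Row-wise softmax with optional temperature softening."""
    z = logits / temperature
    z = z - z.max(axis=-1, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def kl_divergence(p, q, eps=1e-12):
    """Mean KL(p || q) over a batch of distributions."""
    return np.sum(p * (np.log(p + eps) - np.log(q + eps)), axis=-1).mean()

def mutual_distillation_losses(logits_a, logits_b, labels,
                               alpha=0.5, temperature=4.0):
    """Cross-entropy for each network plus a KL regularizer toward its peer.

    `alpha` and `temperature` are illustrative hyperparameters, not values
    taken from the paper.
    """
    rows = np.arange(len(labels))
    p_a = softmax(logits_a)
    p_b = softmax(logits_b)
    ce_a = -np.log(p_a[rows, labels] + 1e-12).mean()
    ce_b = -np.log(p_b[rows, labels] + 1e-12).mean()
    # Softened distributions used only for the distillation term.
    soft_a = softmax(logits_a, temperature)
    soft_b = softmax(logits_b, temperature)
    # Each network is regularized toward the other's predictions.
    loss_a = ce_a + alpha * kl_divergence(soft_b, soft_a)
    loss_b = ce_b + alpha * kl_divergence(soft_a, soft_b)
    return loss_a, loss_b
```

In a metric-based setting, `logits_a` and `logits_b` would be the query-set class scores produced by the two networks (e.g. negative prototype distances for a prototypical network and relation scores for a relation network) within one episode, and each network backpropagates only through its own loss.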
