ServiceNow Research

Reproducibility and Stability Analysis in Metric-Based Few-Shot Learning

Abstract

We propose a study of the stability of several few-shot learning algorithms subject to variations in hyper-parameters and optimization schemes while controlling the random seed. We introduce a methodology for testing for statistical differences in model performance across several replications. To study this specific design, we attempt to reproduce results from three prominent papers: Matching Nets, Prototypical Networks, and TADAM. We conduct our analysis on the miniImagenet dataset, using the standard classification task in the 5-way, 5-shot learning setting at test time. We find that the selected implementations exhibit stability across random seeds and repeats.
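
As a rough illustration (not the paper's code), the sketch below shows how one might compare two few-shot models across repeated runs with controlled random seeds using a paired statistical test. The function evaluate_model, the simulated accuracy values, and the model names are hypothetical placeholders; in practice the evaluation would run a trained model on 5-way, 5-shot episodes sampled from the miniImagenet test split.

# Minimal sketch of a replication-and-test protocol, assuming a hypothetical
# evaluate_model() that reports mean 5-way, 5-shot accuracy for a given seed.
import numpy as np
from scipy import stats


def evaluate_model(model_name: str, seed: int, n_episodes: int = 600) -> float:
    """Hypothetical evaluation: mean accuracy over simulated test episodes."""
    # Deterministic per-model, per-seed generator; a real run would instead
    # sample miniImagenet episodes and evaluate the trained model.
    rng = np.random.default_rng([seed, sum(map(ord, model_name))])
    episode_acc = rng.normal(loc=0.68, scale=0.07, size=n_episodes).clip(0.0, 1.0)
    return float(episode_acc.mean())


def compare_models(model_a: str, model_b: str, seeds=range(10), alpha: float = 0.05):
    """Paired comparison over replications that share the same random seeds."""
    acc_a = np.array([evaluate_model(model_a, s) for s in seeds])
    acc_b = np.array([evaluate_model(model_b, s) for s in seeds])
    # Wilcoxon signed-rank test: non-parametric paired test across repeats.
    stat, p_value = stats.wilcoxon(acc_a, acc_b)
    print(f"{model_a}: {acc_a.mean():.3f} +/- {acc_a.std():.3f}")
    print(f"{model_b}: {acc_b.mean():.3f} +/- {acc_b.std():.3f}")
    verdict = "significant" if p_value < alpha else "not significant"
    print(f"Wilcoxon p-value: {p_value:.4f} ({verdict} at alpha={alpha})")


if __name__ == "__main__":
    compare_models("prototypical_networks", "matching_nets")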

Publication
Workshop at the International Conference on Learning Representations (ICLR)
Denis Kocetkov
AI Developer

AI Developer at the Large Language Models Lab in London, United Kingdom.