Adaptation: Blessing or Curse for Higher Way Meta-Learning
Impact Statement:
ML has become increasingly popular for learning from limited data. However, the evaluation of ML approaches has often been limited to simple tasks with few classes. Our study introduces a realistic setup, highlighting the need to reevaluate the setups used to establish the superiority of ML approaches, as their performance may vary across setups. Moreover, ML approaches have been treated as black boxes. We examine the operation of fundamental ML approaches in this realistic setting and reveal previously unknown insights. We find that the impact of adaptation varies across ML algorithms and plays a crucial role in withstanding complex tasks. Overall, our study highlights the need for a more realistic setup for evaluating ML approaches and provides valuable insights into the adaptation mechanism common to most ML approaches. These insights can guide the development of effective adaptation strategies for more realistic settings in the future.

Abstract:

The prevailing literature typically assesses the effectiveness of meta-learning (ML) approaches on tasks that involve no more than 20 classes. We challenge this convention by constructing a more complex and natural task setup to test fundamental initialization-, metric-, and optimization-based approaches. In particular, we increase the number of classes in the Omniglot and tieredImagenet datasets to 200 and 90, respectively. Interestingly, we observe that as the number of classes increases, ML approaches perform in reverse order of their degree of adaptation, with the prototypical network (ProtoNet) outperforming almost no inner loop (ANIL) and model-agnostic meta-learning (MAML). ProtoNet, which requires no adaptation, is only marginally affected by the increase in task complexity, while ANIL and MAML are strongly affected. Intriguingly, Meta Long Short-Term Memory (MetaLSTM++) performs well despite adapting both the full feature backbone and the classifier. To th...
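The contrast the abstract draws, between ProtoNet, which classifies without any task-specific adaptation, and MAML/ANIL, which run inner-loop gradient steps per task, can be sketched in a few lines. This is an illustrative toy, not the paper's code: the random Gaussian embeddings stand in for a meta-learned feature backbone, and the single inner-loop step on a linear head is a minimal stand-in for the MAML/ANIL adaptation mechanism.

```python
# Toy sketch (assumed setup, not the paper's implementation):
# ProtoNet-style nearest-prototype classification vs. one
# MAML/ANIL-style inner-loop gradient step on a linear head.
import numpy as np

rng = np.random.default_rng(0)

def protonet_predict(support, support_labels, queries, n_way):
    """ProtoNet: one mean embedding (prototype) per class, no adaptation."""
    prototypes = np.stack([support[support_labels == c].mean(axis=0)
                           for c in range(n_way)])
    # squared Euclidean distance from each query to each prototype
    d = ((queries[:, None, :] - prototypes[None, :, :]) ** 2).sum(-1)
    return d.argmin(axis=1)

def inner_loop_step(W, support, support_labels, lr=0.1):
    """One MAML/ANIL-style adaptation step: cross-entropy gradient
    descent on a softmax linear classifier over the support set."""
    logits = support @ W
    logits -= logits.max(axis=1, keepdims=True)       # numerical stability
    p = np.exp(logits)
    p /= p.sum(axis=1, keepdims=True)
    onehot = np.eye(W.shape[1])[support_labels]
    grad = support.T @ (p - onehot) / len(support)
    return W - lr * grad

# Toy 5-way, 1-shot task in a 16-d embedding space.
n_way, dim = 5, 16
class_means = rng.normal(size=(n_way, dim))
support = class_means + 0.1 * rng.normal(size=(n_way, dim))
support_labels = np.arange(n_way)
queries = class_means + 0.1 * rng.normal(size=(n_way, dim))

pred = protonet_predict(support, support_labels, queries, n_way)

W = 0.01 * rng.normal(size=(dim, n_way))   # task-agnostic initial head
W_adapted = inner_loop_step(W, support, support_labels)
```

As the number of ways grows, the per-task work of the adaptation-based methods (the inner-loop step above, repeated over a larger head) scales with the class count, whereas ProtoNet only computes more prototypes, which is consistent with the differing sensitivity to task complexity the abstract reports.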
Published in: IEEE Transactions on Artificial Intelligence ( Volume: 5, Issue: 4, April 2024)
Page(s): 1844 - 1856
Date of Publication: 04 August 2023
Electronic ISSN: 2691-4581

