
AAAI 2023

Efficient Dynamic Batch Adaptation (Student Abstract)

Short Paper · AAAI Student Abstract and Poster Program · Artificial Intelligence

Abstract

In this paper we introduce Efficient Dynamic Batch Adaptation (EDBA), which improves on a previous method that adjusts the composition and the size of the current batch. Our improvements allow Dynamic Batch Adaptation to scale feasibly to bigger models and datasets, drastically improving model convergence and generalization. We show that the method still performs especially well in data-scarce scenarios, obtaining a test accuracy of 90.68% on 100 samples of CIFAR-10, while the baseline reaches only 23.79%. On the full CIFAR-10 dataset, EDBA reaches convergence in ∼120 epochs while the baseline requires ∼300 epochs.
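The abstract describes a method that adjusts both which samples go into the current batch and how large that batch is. As a rough illustration of this general idea (not the EDBA algorithm itself, whose details are not given here), one common online-batch-selection heuristic picks the highest-loss samples, and a simple adaptation rule grows the batch when training plateaus. All names and thresholds below are hypothetical:

```python
# Hypothetical sketch of online batch selection with adaptive batch size.
# This illustrates the general family of methods the abstract refers to;
# it is NOT the EDBA algorithm from the paper.

def select_batch(per_sample_loss, batch_size):
    """Return indices of the `batch_size` highest-loss samples."""
    ranked = sorted(range(len(per_sample_loss)),
                    key=lambda i: per_sample_loss[i],
                    reverse=True)
    return ranked[:batch_size]

def adapt_batch_size(batch_size, prev_loss, curr_loss,
                     grow=1.5, max_size=512):
    """Grow the batch when mean loss stops improving (plateau heuristic)."""
    if curr_loss >= prev_loss:
        return min(int(batch_size * grow), max_size)
    return batch_size
```

For example, with per-sample losses `[0.1, 0.9, 0.5, 0.7]` and a batch size of 2, `select_batch` keeps samples 1 and 3; if the mean loss fails to improve, `adapt_batch_size` enlarges the next batch by the `grow` factor.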

Keywords

  • Few Shot Learning
  • Online Batch Selection
  • Optimization

Context

Venue: AAAI Conference on Artificial Intelligence
Archive span: 1980–2026
Indexed papers: 28,718
Paper id: 527811277911418499