Meta-descent for Online, Continual Prediction

This paper investigates vector step-size adaptation approaches for non-stationary, online continual prediction problems. Vanilla stochastic gradient descent can be considerably improved by scaling the update with a vector of appropriately chosen step-sizes. Many methods, including AdaGrad, RMSProp, and AMSGrad, keep statistics about the learning process to approximate a second-order update, namely a vector approximation of the inverse Hessian. Another family of approaches uses meta-gradient descent to adapt the step-size parameters to minimize prediction error. These meta-descent strategies are promising for non-stationary problems, but have not been as extensively explored as quasi-second-order methods. We first derive a general, incremental meta-descent algorithm, called AdaGain, designed to be applicable to a much broader range of algorithms, including those with semi-gradient updates or even those with accelerations, such as RMSProp. We provide an empirical comparison of methods from both families. We conclude that methods from both families can perform well, but that meta-descent methods exhibit advantages in non-stationary prediction problems. Our method is particularly robust across several prediction problems, and is competitive with the state-of-the-art method on a large-scale time-series prediction problem using real data from a mobile robot.
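To illustrate the vector step-size idea the abstract refers to, the sketch below (an illustration only, not the paper's AdaGain algorithm) contrasts plain SGD, which scales every parameter by the same scalar step-size, with an RMSProp-style update, where a running average of squared gradients gives each parameter its own effective step-size. All function names and constants here are assumptions chosen for the example.

```python
import numpy as np

def sgd_step(w, grad, alpha=0.005):
    # Scalar step-size: every coordinate is scaled identically.
    return w - alpha * grad

def rmsprop_step(w, grad, v, alpha=0.05, beta=0.9, eps=1e-8):
    # Vector step-size: the running average v of squared gradients
    # yields a per-coordinate effective step-size alpha / sqrt(v).
    v = beta * v + (1.0 - beta) * grad**2
    return w - (alpha / (np.sqrt(v) + eps)) * grad, v

# Toy ill-conditioned quadratic f(w) = 0.5 * w^T diag(h) w,
# so the gradient is h * w and curvature differs per coordinate.
h = np.array([100.0, 1.0])
w_sgd = np.array([1.0, 1.0])
w_rms, v = np.array([1.0, 1.0]), np.zeros(2)
for _ in range(100):
    w_sgd = sgd_step(w_sgd, h * w_sgd)
    w_rms, v = rmsprop_step(w_rms, h * w_rms, v)
```

With a single scalar step-size, the step must be kept small enough for the high-curvature coordinate, which slows progress on the low-curvature one; the vector step-size adapts each coordinate separately.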

Citation

A. Jacobsen, M. Schlegel, C. Linke, T. Degris, A. White, M. White. "Meta-descent for Online, Continual Prediction". National Conference on Artificial Intelligence (AAAI), pp 3943-3950, January 2019.

Category: In Conference
Web Links: DOI, AAAI

BibTeX

@inproceedings{Jacobsen+al:AAAI19,
  author    = {Andrew Jacobsen and Matthew Schlegel and Cameron Linke and Thomas
    Degris and Adam White and Martha White},
  title     = {Meta-descent for Online, Continual Prediction},
  booktitle = {National Conference on Artificial Intelligence (AAAI)},
  pages     = {3943--3950},
  year      = {2019},
}

Last Updated: February 25, 2020