
Temporal Abstraction in Temporal-Difference Networks

Full Text: NIPS2005_0261.pdf

Temporal-difference (TD) networks have been proposed as a way of representing and learning a wide variety of predictions about the interaction between an agent and its environment (Sutton & Tanner, 2005). These predictions are compositional in that their targets are defined in terms of other predictions, and subjunctive in that they are about what would happen if an action or sequence of actions were taken. In conventional TD networks, the inter-related predictions are at successive time steps and contingent on a single action; here we generalize them to accommodate extended time intervals and contingency on whole ways of behaving. Our generalization is based on the options framework for temporal abstraction (Sutton, Precup & Singh, 1999). The primary contribution of this paper is to introduce a new algorithm for intra-option learning in TD networks with function approximation and eligibility traces. We present empirical examples of our algorithm's effectiveness and of the greater representational expressiveness of temporally-abstract TD networks.
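To make the abstract's key ideas concrete, here is a minimal, hypothetical sketch of an option-conditional prediction node updated by intra-option TD learning with linear function approximation and an eligibility trace. The class and parameter names (`beta_next`, `rho`, `lam`) are illustrative assumptions, not the authors' exact algorithm; the sketch only shows the general recursion in which a node bootstraps from its own next prediction while its option continues and from its target when the option terminates.

```python
import numpy as np

class OptionConditionalPredictionNode:
    """Sketch of one TD-network node whose prediction is conditional on an option."""

    def __init__(self, n_features, alpha=0.1, lam=0.9):
        self.w = np.zeros(n_features)   # weights of the linear prediction
        self.e = np.zeros(n_features)   # accumulating eligibility trace
        self.alpha = alpha              # step-size parameter
        self.lam = lam                  # trace-decay parameter (lambda)

    def predict(self, phi):
        """Linear prediction y = w . phi for feature vector phi."""
        return float(self.w @ phi)

    def update(self, phi, phi_next, rho, beta_next, termination_target):
        """One intra-option TD(lambda) step (illustrative form).

        phi, phi_next       -- feature vectors at steps t and t+1
        rho                 -- importance weight: option-policy probability of the
                               observed action divided by the behavior probability
        beta_next           -- option termination probability in the next state
        termination_target  -- the node's target if the option terminates here
                               (in a TD network, another node's prediction or
                               an observation bit)
        """
        y = self.predict(phi)
        y_next = self.predict(phi_next)
        # Continue/terminate mixture: bootstrap from the node's own next
        # prediction if the option continues, from the target if it terminates.
        td_target = (1.0 - beta_next) * y_next + beta_next * termination_target
        delta = td_target - y
        # Credit is assigned only to the extent the behavior was consistent
        # with the option's policy -- the intra-option aspect of the update.
        self.e = rho * (self.lam * self.e + phi)
        self.w += self.alpha * delta * self.e
        return delta
```

A usage step might pass `rho = 1.0` when the behavior action matches a deterministic option policy and `rho = 0.0` otherwise, recovering the simpler action-conditional updates of conventional TD networks as a special case.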

Citation

R. Sutton, E. Rafols, A. Koop. "Temporal Abstraction in Temporal-Difference Networks". Neural Information Processing Systems (NIPS), Vancouver, British Columbia, Canada, December 2005.

Keywords: empirical, generalization, framework, machine learning
Category: In Conference

BibTeX

@incollection{Sutton+al:NIPS05,
  author = {Richard S. Sutton and Eddie J. Rafols and Anna Koop},
  title = {Temporal Abstraction in Temporal-Difference Networks},
  booktitle = {Neural Information Processing Systems (NIPS)},
  year = 2005,
}

