Arrow Research search

Author name cluster

Jeff Johns

Possible papers associated with this exact author name in Arrow. This page groups case-insensitive exact matches on the name; it is not a full identity-disambiguation profile.

3 papers
1 author row

Possible papers (3)

AAAI Conference 2007 · Conference Paper

Compact Spectral Bases for Value Function Approximation Using Kronecker Factorization

  • Jeff Johns

A new spectral approach to value function approximation has recently been proposed to automatically construct basis functions from samples. Global basis functions called proto-value functions are generated by diagonalizing a diffusion operator, such as a reversible random walk or the Laplacian, on a graph formed by connecting nearby samples. This paper addresses the challenge of scaling this approach to large domains. We propose using Kronecker factorization coupled with the Metropolis-Hastings algorithm to decompose reversible transition matrices. The result is that the basis functions can be computed on much smaller matrices and combined to form the overall bases. We demonstrate that in several continuous Markov decision processes, compact basis functions can be constructed without significant loss in performance. In one domain, basis functions were compressed by a factor of 36. A theoretical analysis relates the quality of the approximation to the spectral gap. Our approach generalizes to other basis constructions as well.
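
The scaling trick rests on a standard Kronecker identity: if a transition matrix factors as P = A ⊗ B, then the eigenvectors of P are Kronecker products of eigenvectors of the much smaller factors A and B, so the expensive diagonalization never touches the full matrix. A minimal numerical sketch of that identity (random illustrative matrices, not the paper's Metropolis-Hastings decomposition):

```python
import numpy as np

def random_walk_matrix(n, rng):
    """Row-normalized random walk on a dense symmetric weight graph."""
    W = rng.random((n, n))
    W = (W + W.T) / 2                     # symmetric weights -> reversible walk
    return W / W.sum(axis=1, keepdims=True)

rng = np.random.default_rng(0)
A = random_walk_matrix(10, rng)           # small factor matrices
B = random_walk_matrix(12, rng)
P = np.kron(A, B)                         # full 120 x 120 transition matrix

# Diagonalize only the small factors.
wa, Va = np.linalg.eig(A)
wb, Vb = np.linalg.eig(B)

# Combining factor eigenvectors yields an eigenvector of the full matrix,
# i.e. a basis function computed without ever diagonalizing P itself.
phi = np.kron(Va[:, 0], Vb[:, 0])
lam = wa[0] * wb[0]
assert np.allclose(P @ phi, lam * phi)
```

Here a 120 × 120 eigenproblem is replaced by a 10 × 10 and a 12 × 12 one; the paper's contribution is finding a good factorization of this form for transition matrices that do not factor exactly.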

AAAI Conference 2006 · Conference Paper

A Dynamic Mixture Model to Detect Student Motivation and Proficiency

  • Jeff Johns

Unmotivated students do not reap the full rewards of using a computer-based intelligent tutoring system. Detection of improper behavior is thus an important component of an online student model. To meet this challenge, we present a dynamic mixture model based on Item Response Theory. This model, which simultaneously estimates a student’s proficiency and changing motivation level, was tested on data from high school students using a geometry tutoring system. By accounting for student motivation, the dynamic mixture model can more accurately estimate proficiency and the probability of a correct response. The model’s generality is an added benefit, making it applicable to many intelligent tutoring systems as well as other domains.
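
For intuition, motivation can enter an Item Response Theory model as a mixture weight: a motivated response follows the usual proficiency curve, while an unmotivated one is closer to guessing. A minimal static sketch (illustrative parameters; the paper's model additionally lets motivation evolve over time):

```python
import numpy as np

def irt_prob(theta, a, b):
    """2-parameter logistic IRT: P(correct | proficiency theta, item a, b)."""
    return 1.0 / (1.0 + np.exp(-a * (theta - b)))

def mixture_prob(theta, a, b, m, guess=0.25):
    """P(correct) for a student who is motivated with probability m.

    Motivated responses follow the IRT curve; unmotivated responses
    reduce to chance (guess = 0.25, e.g. four-option multiple choice).
    """
    return m * irt_prob(theta, a, b) + (1.0 - m) * guess

# The same strong student on the same item, motivated vs. disengaged:
print(mixture_prob(theta=1.5, a=1.0, b=0.5, m=0.95))  # ~0.71
print(mixture_prob(theta=1.5, a=1.0, b=0.5, m=0.40))  # ~0.44
```

Ignoring the motivation term would force the proficiency estimate to absorb the disengaged answers, which is the bias the dynamic mixture model is designed to correct.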

AAAI Conference 2005 · Conference Paper

A Variational Learning Algorithm for the Abstract Hidden Markov Model

  • Jeff Johns

We present a fast algorithm for learning the parameters of the abstract hidden Markov model, a type of hierarchical activity recognition model. Learning using exact inference scales poorly as the number of levels in the hierarchy increases; therefore, an approximation is required for large models. We demonstrate that variational inference is well suited to solve this problem. Not only does this technique scale, but it also offers a natural way to leverage the context-specific independence properties inherent in the model via the fixed-point equations. Experiments confirm that the variational approximation significantly reduces the time necessary for learning while estimating parameter values that can be used to make reliable predictions.
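
The fixed-point equations follow the usual mean-field pattern: factor the intractable posterior, then update each factor from the expected log joint under the others until convergence. A toy sketch on a two-variable joint (far simpler than the abstract hidden Markov model, but the same coordinate-ascent loop):

```python
import numpy as np

# Toy joint p(x, z) over 3 x 4 discrete states (random, for illustration).
rng = np.random.default_rng(0)
log_p = np.log(rng.dirichlet(np.ones(12)).reshape(3, 4))

qx = np.full(3, 1 / 3)   # variational factor q(x), initialized uniform
qz = np.full(4, 1 / 4)   # variational factor q(z)

for _ in range(50):
    # Fixed-point updates: q(x) ∝ exp(E_{q(z)}[log p(x, z)]), and symmetrically.
    qx = np.exp(log_p @ qz)
    qx /= qx.sum()
    qz = np.exp(qx @ log_p)
    qz /= qz.sum()

# q(x) q(z) is the factored approximation found by coordinate ascent; it
# matches the exact marginals only insofar as x and z are weakly coupled.
p = np.exp(log_p)
print(qx, p.sum(axis=1))
print(qz, p.sum(axis=0))
```

Each update costs only a small matrix-vector product, which is why the variational approach continues to scale where exact inference over the full hierarchy does not.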