An information geometric perspective on active learning
Chen-Hsiang Yeang
2010-04-24
16:20 - 17:20
308 , Mathematics Research Center Building (ori. New Math. Bldg.)
The Fisher information matrix plays a central role in both active learning and information geometry. In a special case of active learning (nonlinear regression with Gaussian noise), the inverse of the Fisher information matrix -- the dispersion matrix of the parameters -- induces a variety of criteria for optimal experiment design. In information geometry, the Fisher information matrix defines the metric tensor on model manifolds. In this paper, I explore the intrinsic relations between these two fields. Conditional distributions belonging to exponential families are known to be dually flat. Moreover, I prove that for a certain type of conditional model, the embedding curvature in terms of the true parameters also vanishes. The expected Riemannian distance between the current parameters and the next update is proposed as the loss function for active learning. Examples of nonlinear regression and logistic regression are given to elucidate this active learning scheme.
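As a concrete illustration of the experiment-design criteria mentioned in the abstract, the sketch below computes the Fisher information matrix for nonlinear regression with Gaussian noise and greedily selects the next input by D-optimality (maximizing the determinant of the information matrix, equivalently minimizing the volume of the parameter dispersion ellipsoid). The model `f(x; theta) = theta0 * exp(-theta1 * x)` and all function names are illustrative choices, not taken from the paper.

```python
import numpy as np

# For nonlinear regression y = f(x; theta) + eps, eps ~ N(0, sigma^2),
# the Fisher information matrix is
#   FIM(theta) = (1/sigma^2) * sum_i J_i J_i^T,
# where J_i = grad_theta f(x_i; theta).

def grad_f(x, theta):
    # Illustrative model f(x; theta) = theta0 * exp(-theta1 * x);
    # returns the gradient with respect to (theta0, theta1).
    t0, t1 = theta
    e = np.exp(-t1 * x)
    return np.array([e, -t0 * x * e])

def fisher_information(xs, theta, sigma=1.0):
    # Sum of outer products of per-observation gradients.
    J = np.array([grad_f(x, theta) for x in xs])  # shape (n, 2)
    return (J.T @ J) / sigma**2

def d_optimal_next_point(candidates, xs, theta, sigma=1.0):
    # Greedy D-optimal design: pick the candidate input whose
    # addition maximizes det(FIM) of the augmented design.
    base = fisher_information(xs, theta, sigma)
    scores = []
    for c in candidates:
        g = grad_f(c, theta)
        scores.append(np.linalg.det(base + np.outer(g, g) / sigma**2))
    return candidates[int(np.argmax(scores))]

theta = np.array([2.0, 0.5])      # current parameter estimate
xs = [0.1, 0.5]                   # inputs queried so far
candidates = [0.0, 1.0, 2.0, 4.0] # candidate next queries
print(d_optimal_next_point(candidates, xs, theta))
```

Other classical criteria (A-, E-optimality) replace the determinant with the trace or the largest eigenvalue of the dispersion matrix; the talk's proposal instead uses the expected Riemannian distance induced by this same metric.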