Information divergence class and robust statistical methods II
Shinto Eguchi
2010-04-24
11:00 - 12:00
308, Mathematics Research Center Building (originally New Math. Bldg.)
Information geometry is, in a narrow sense, a dual Riemannian geometry; it aims to elucidate the dualistic structure associated with models and inference in statistics. The dual connections, the exponential connection and the mixture connection, provide the most natural lines in modeling and decision making, and the mean of the dual connections reduces to the Riemannian connection with respect to the metric defined by Fisher information. We observe that the Kullback-Leibler divergence is associated with this duality, in which maximum likelihood and the exponential model interplay in an idealistic manner with minimal sufficiency, efficiency, invariance, and so forth. We overview this basic idea and extend it in the subsequent discussion.
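For reference, the objects named in the abstract admit the following standard formulation, written here as a brief sketch in the usual Amari-style information-geometry notation (not drawn from the talk itself):

% Kullback-Leibler divergence between densities p and q:
D_{\mathrm{KL}}(p \,\|\, q) = \int p(x) \log \frac{p(x)}{q(x)} \, dx

% Fisher information metric on a parametric model {p_theta}:
g_{ij}(\theta) = \mathrm{E}_\theta\!\left[ \frac{\partial \log p_\theta(X)}{\partial \theta_i} \, \frac{\partial \log p_\theta(X)}{\partial \theta_j} \right]

% The exponential connection \nabla^{(e)} and the mixture connection
% \nabla^{(m)} are dual with respect to g; their mean is the Riemannian
% (Levi-Civita) connection of the Fisher metric:
\nabla^{(0)} = \tfrac{1}{2} \bigl( \nabla^{(e)} + \nabla^{(m)} \bigr)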
For material related to this talk, click here.