MIT CogNet, The Brain Sciences ConnectionFrom the MIT Press, Link to Online Catalog

Unsupervised and Supervised Clustering: the Mutual Information Between Parameters and Observations

 Didier Herschkowitz and Jean-Pierre Nadal
Abstract:
Recent work in parameter estimation and neural coding has demonstrated that optimal performance is related to the mutual information between parameters and data. We study this mutual information for a family of supervised and unsupervised learning tasks. More precisely, we consider the case where the conditional probability distribution of each observation depends on the parameter only through their scalar product, the parameter and the observations being vectors in a possibly high-dimensional space.

We derive exact bounds and exact asymptotic behaviours for the mutual information as a function of the data size and of some properties of the probability of the data given the parameter. We also study the behaviour of the mutual information as predicted by replica calculations. Finally, we discuss the universal properties of the mutual information, especially in the limit of large data size.
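As a toy illustration of the kind of scaling the abstract refers to (this model is not from the paper): for a scalar Gaussian parameter θ ~ N(0, σ_θ²) and n independent observations y_i = θ + ε_i with Gaussian noise, the mutual information I(θ; y_1..y_n) has a closed form and grows like (1/2) ln n for large n, a logarithmic dependence on data size. A minimal sketch, assuming this hypothetical Gaussian channel:

```python
import math

def gaussian_mi(n, var_theta=1.0, var_noise=1.0):
    """Mutual information I(theta; y_1..y_n) in nats for the toy model
    theta ~ N(0, var_theta), y_i = theta + eps_i, eps_i ~ N(0, var_noise).
    The sample mean is a sufficient statistic with variance var_noise / n,
    giving the closed form I = 0.5 * ln(1 + n * var_theta / var_noise).
    (Illustrative model only, not the family analysed in the paper.)"""
    return 0.5 * math.log(1.0 + n * var_theta / var_noise)

# The information grows logarithmically with the data size n,
# approaching 0.5 * ln(n) in the large-n limit:
for n in (1, 10, 100, 1000):
    print(n, round(gaussian_mi(n), 4), round(0.5 * math.log(n), 4))
```

The gap between the exact value and (1/2) ln n vanishes as n grows, illustrating in the simplest possible setting the universal large-data-size behaviour the abstract discusses for far richer models.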

© 2010 The MIT Press