Self-organizing mixture networks for probability density estimation

Yin, Hujun and Allinson, Nigel (2001) Self-organizing mixture networks for probability density estimation. IEEE Transactions on Neural Networks, 12 (2). pp. 405-411. ISSN 1045-9227

Full content URL: http://ieeexplore.ieee.org/xpl/articleDetails.jsp?...

Full text not available from this repository.

Item Type: Article
Item Status: Live Archive

Abstract

A self-organizing mixture network (SOMN) is derived for learning arbitrary density functions. The network minimizes the Kullback-Leibler information metric by means of stochastic approximation methods. The density functions are modeled as mixtures of parametric distributions; a mixture need not be homogeneous, i.e., it can have components with different density profiles. The first layer of the network is similar to Kohonen's self-organizing map (SOM), but with the parameters of the component densities as the learning weights. The winning mechanism is based on maximum posterior probability, and updating of the weights is limited to a small neighborhood around the winner. The second layer accumulates the responses of these local nodes, weighted by the learned mixing parameters. The network possesses a simple structure and computational form, yet yields fast and robust convergence. It also generalizes well owing to the relative entropy criterion used. Applications to density profile estimation and pattern classification are presented. The SOMN also provides insight into the role of the neighborhood function used in the SOM.
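
To make the learning scheme described in the abstract concrete, the following is a minimal sketch in Python of an SOMN-style estimator under several assumptions not stated above: the components are isotropic Gaussians placed on a one-dimensional lattice, the neighborhood is Gaussian, and the learning-rate and neighborhood decay schedules are illustrative choices rather than the paper's exact stochastic-approximation equations. Function and parameter names (posterior, neighborhood, lr, sigma_nbr) are hypothetical.

    import numpy as np

    rng = np.random.default_rng(0)

    K = 10                       # number of component nodes on the lattice
    mu = rng.uniform(-3, 3, K)   # component means (the learning weights)
    var = np.ones(K)             # component variances
    pi = np.ones(K) / K          # mixing parameters

    def neighborhood(winner, sigma_nbr):
        # Gaussian neighborhood on the lattice, centered at the winning node.
        d = np.arange(K) - winner
        return np.exp(-0.5 * (d / sigma_nbr) ** 2)

    def posterior(x):
        # Posterior responsibilities p(j | x) and the mixture density at x.
        like = np.exp(-0.5 * (x - mu) ** 2 / var) / np.sqrt(2 * np.pi * var)
        joint = pi * like
        return joint / joint.sum(), joint.sum()

    # Stochastic, sample-by-sample training on a two-mode target density.
    data = np.concatenate([rng.normal(-1.5, 0.4, 2000), rng.normal(1.0, 0.6, 2000)])
    lr, sigma_nbr = 0.05, 2.0
    for x in rng.permutation(data):
        resp, _ = posterior(x)
        winner = np.argmax(resp)                     # maximum-posterior winning rule
        w = neighborhood(winner, sigma_nbr) * resp   # confine updates near the winner
        err = x - mu
        var += lr * w * (err ** 2 - var)             # adapt variances
        mu += lr * w * err                           # move means toward the sample
        pi += lr * (w - pi * w.sum())                # nudge mixing parameters
        pi = np.clip(pi, 1e-6, None); pi /= pi.sum()
        lr *= 0.9995                                 # shrink learning rate over time
        sigma_nbr = max(0.5, sigma_nbr * 0.999)      # shrink neighborhood over time

    # Second layer: the learned mixture density evaluated at query points.
    xs = np.linspace(-4, 4, 9)
    print(np.round([posterior(x)[1] for x in xs], 3))

After training, the printed values approximate the two-mode target density; the accumulation of component responses weighted by the mixing parameters corresponds to what the abstract describes as the second layer.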

Keywords: Algorithms, Approximation theory, Learning systems, Pattern recognition, Probability density function, Random processes, Expectation-maximization (EM) algorithms, Self-organizing mixture networks (SOMN), Self-organizing maps
Subjects: G Mathematical and Computer Sciences > G400 Computer Science
Divisions: College of Science > School of Computer Science
ID Code: 8579
Deposited On: 26 Apr 2013 10:17