Original Articles

Divergence Functions of Nonnegative Matrix Factorization: A Comparison Study

Pages 1594-1612 | Received 01 Jul 2010, Accepted 01 Mar 2010, Published online: 09 Aug 2011

Abstract

Nonnegative Matrix Factorization (NMF) has become one of the most popular models in data mining because of its good performance in unsupervised learning applications. Recently, a variety of divergence functions have been studied extensively for NMF, but there is still a lack of analysis of the relationship between the divergence functions and the applications. This article gives some preliminary results on this interesting problem. Our experiments show that the two most familiar divergence functions—the least squares error and the K-L divergence—are competent for unsupervised learning tasks such as gene expression data clustering and image processing.
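To make the comparison concrete, the two divergences discussed above can both be minimized with the classical Lee–Seung multiplicative update rules, which differ only in the update formulas. The sketch below is an illustrative NumPy implementation (not the authors' code); the function name `nmf` and its parameters are hypothetical, and `eps` is a small constant added for numerical stability.

```python
import numpy as np

def nmf(V, r, divergence="frobenius", n_iter=200, eps=1e-9, seed=0):
    """Factor a nonnegative matrix V (m x n) as V ~= W @ H with inner rank r,
    using Lee-Seung multiplicative updates.

    divergence: "frobenius" minimizes the least squares error ||V - WH||_F^2;
                "kl" minimizes the (generalized) K-L divergence D(V || WH).
    """
    rng = np.random.default_rng(seed)
    m, n = V.shape
    W = rng.random((m, r)) + eps   # nonnegative random initialization
    H = rng.random((r, n)) + eps
    for _ in range(n_iter):
        if divergence == "frobenius":
            # Least squares updates: ratios of gradient terms.
            H *= (W.T @ V) / (W.T @ W @ H + eps)
            W *= (V @ H.T) / (W @ H @ H.T + eps)
        else:
            # K-L updates: reweight by the elementwise ratio V / (WH).
            WH = W @ H + eps
            H *= (W.T @ (V / WH)) / (W.sum(axis=0, keepdims=True).T + eps)
            WH = W @ H + eps
            W *= ((V / WH) @ H.T) / (H.sum(axis=1, keepdims=True) .T + eps)
    return W, H
```

Because both rules are multiplicative, W and H stay nonnegative throughout, and each iteration is guaranteed not to increase its respective divergence.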


Acknowledgment

The author is very grateful to the reviewers for their valuable comments. This work is supported by the Foundation of the Academic Discipline Program at the Central University of Finance and Economics, P. R. China.
