Original Articles

Divergence Functions of Non-negative Matrix Factorization: A Comparison Study

Pages 1594-1612 | Received 01 Jul 2010, Accepted 01 Mar 2010, Published online: 09 Aug 2011
 

Abstract

Non-negative Matrix Factorization (NMF) has become one of the most popular models in data mining because of its good performance in unsupervised learning applications. Recently, a variety of divergence functions have been studied extensively for NMF. However, there is still a lack of analysis of the relationship between the choice of divergence function and the application. This article gives some preliminary results on this interesting problem. Our experiments show that the two most familiar divergence functions, the least squares error and the K-L divergence, are competent for unsupervised learning tasks such as gene expression data clustering and image processing.
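As a quick illustration of the two divergence functions compared in the article, the short sketch below is not taken from the article itself; it uses scikit-learn's NMF implementation with multiplicative updates and a randomly generated matrix purely for demonstration, fitting the same data under the least squares (Frobenius) error and the K-L divergence.

    # Illustrative sketch only: factorize a nonnegative matrix X into W and H under
    # the two divergence functions compared in the article, using scikit-learn's NMF.
    import numpy as np
    from sklearn.decomposition import NMF

    rng = np.random.default_rng(0)
    X = rng.random((100, 40))  # hypothetical nonnegative data matrix (e.g., expression profiles)

    for loss in ("frobenius", "kullback-leibler"):  # least squares error vs. K-L divergence
        model = NMF(n_components=5, beta_loss=loss, solver="mu",
                    max_iter=500, random_state=0)
        W = model.fit_transform(X)   # basis/encoding matrix, shape (100, 5)
        H = model.components_        # factor matrix, shape (5, 40)
        print(loss, "final divergence:", model.reconstruction_err_)

Comparing the clusterings obtained from W under each loss is one simple way to reproduce the kind of comparison reported in the article.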

Mathematics Subject Classification:

Acknowledgment

The author greatly appreciates the reviewers’ valuable comments. This work is supported by the Foundation of Academic Discipline Program at the Central University of Finance and Economics, P. R. China.
