Abstract
Spectral clustering algorithms typically act on the top few eigenvectors of an affinity matrix derived from the data matrix. Based on a Gaussian mixture model with K components, we derive eigen selection procedures that improve these algorithms in high-dimensional settings. Our selection principle formalizes two intuitions: (i) eigenvectors should be dropped when they have no clustering power; (ii) some eigenvectors corresponding to smaller spiked eigenvalues should be dropped due to estimation inaccuracy. Our selection procedures lead to new spectral clustering algorithms: ESSC for K = 2 and GESSC for K > 2. The newly proposed algorithms enjoy better stability and compare favorably against canonical alternatives, as demonstrated in extensive simulation and multiple real data studies. Supplementary materials for this article are available online.
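To make the pipeline concrete, the following is a minimal sketch of spectral clustering with a simple eigen selection step. It is illustrative only: the Gram-type affinity, the eigenvalue-ratio cutoff used to drop weak eigenvectors, and the k-means step are all assumptions for the example, not the ESSC/GESSC procedures of the paper.

```python
import numpy as np

def spectral_cluster_with_selection(X, K, eig_budget=None):
    """Illustrative spectral clustering with a crude eigen selection step.

    X: (n, p) data matrix; K: number of clusters.
    The affinity, cutoff, and k-means details are assumptions for this sketch.
    """
    # One common affinity choice: the Gram matrix X X^T (the paper's
    # construction may differ).
    A = X @ X.T
    vals, vecs = np.linalg.eigh(A)
    order = np.argsort(-np.abs(vals))          # sort by eigenvalue magnitude
    vals, vecs = vals[order], vecs[:, order]

    # Eigen selection (illustrative): among the top `eig_budget` eigenvectors,
    # drop those whose eigenvalues are small relative to the largest, a crude
    # proxy for "no clustering power / too noisy to estimate".
    if eig_budget is None:
        eig_budget = K
    keep = [0]
    for j in range(1, eig_budget):
        if abs(vals[j]) > 0.1 * abs(vals[0]):  # hypothetical cutoff
            keep.append(j)
    V = vecs[:, keep]                          # (n, |keep|) embedding

    # Greedy farthest-point initialization (deterministic, avoids
    # degenerate starts), then plain k-means on the embedding.
    centers = [V[0]]
    for _ in range(K - 1):
        d = np.min([((V - c) ** 2).sum(1) for c in centers], axis=0)
        centers.append(V[np.argmax(d)])
    centers = np.array(centers)
    for _ in range(50):
        labels = np.argmin(((V[:, None, :] - centers[None]) ** 2).sum(-1), axis=1)
        for k in range(K):
            if np.any(labels == k):
                centers[k] = V[labels == k].mean(0)
    return labels
```

On two well-separated Gaussian clusters, the selection step typically retains only the single informative eigenvector, and k-means on that one-dimensional embedding recovers the partition.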
Supplementary Material
The supplementary material includes technical lemmas and proofs.
Notes
1 A comparison with an alternative affinity matrix construction is given in Section 3.2.