References
- Amezcua, J., P. Melin, and O. Castillo. 2015. Design of an optimal modular LVQ network for classification of arrhythmias based on a variable training-test datasets strategy. Intelligent Systems 323:369–75.
- Arthur, D., and S. Vassilvitskii. 2007. K-means++: The advantages of careful seeding. In Proceedings of the Eighteenth Annual ACM-SIAM Symposium on Discrete Algorithms, 1027–35. Philadelphia, PA: Society for Industrial and Applied Mathematics.
- Bhatt, R., and A. Dhall. 2012. Skin Segmentation Dataset. UCI machine learning repository. http://archive.ics.uci.edu/ml/datasets/skin+segmentation.
- Bock, M. E. 1975. Minimax estimators of the mean of a multivariate normal distribution. The Annals of Statistics 3 (1):209–18. doi:https://doi.org/10.1214/aos/1176343009.
- Cuesta-Albertos, J. A., A. Gordaliza, and C. Matrán. 1997. Trimmed k-means: An attempt to robustify quantizers. The Annals of Statistics 25 (2):553–76. doi:https://doi.org/10.1214/aos/1031833664.
- Cuesta-Albertos, J. A., C. Matrán, and A. Mayo-Iscar. 2008. Robust estimation in the normal mixture model based on robust clustering. Journal of the Royal Statistical Society: Series B (Statistical Methodology) 70 (4):779–802. doi:https://doi.org/10.1111/j.1467-9868.2008.00657.x.
- Damasceno, F. F., M. B. Veras, D. P. Mesquita, J. P. P. Gomes, and C. E. Brito. 2016. Shrinkage k-means: A clustering algorithm based on the James-Stein estimator. In 2016 5th Brazilian Conference on Intelligent Systems (BRACIS), 433–7. IEEE.
- Dinler, D., and M. K. Tural. 2017. Robust semi-supervised clustering with polyhedral and circular uncertainty. Neurocomputing 265:4–27. doi:https://doi.org/10.1016/j.neucom.2017.04.073.
- Dua, D., and C. Graff. 2017. UCI machine learning repository. https://archive.ics.uci.edu/ml/datasets/banknote+authentication.
- Gao, J., and D. B. Hitchcock. 2010. James-Stein shrinkage to improve k-means cluster analysis. Computational Statistics & Data Analysis 54 (9):2113–27. doi:https://doi.org/10.1016/j.csda.2010.03.018.
- García-Escudero, L. A., A. Gordaliza, R. San Martin, S. Van Aelst, and R. Zamar. 2009. Robust linear clustering. Journal of the Royal Statistical Society: Series B (Statistical Methodology) 71 (1):301–18. doi:https://doi.org/10.1111/j.1467-9868.2008.00682.x.
- Howe, J. A. 2017. Improved clustering with augmented k-means. arXiv preprint arXiv:1705.07592.
- Hubert, L., and P. Arabie. 1985. Comparing partitions. Journal of Classification 2 (1):193–218. doi:https://doi.org/10.1007/BF01908075.
- Inokuchi, R., and S. Miyamoto. 2004. LVQ clustering and SOM using a kernel function. In Proceedings of the 2004 IEEE International Conference on Fuzzy Systems, 3:1497–500.
- Jain, A. K., and R. C. Dubes. 1988. Algorithms for clustering data. Upper Saddle River, NJ: Prentice-Hall, Inc.
- Jain, A. K., M. N. Murty, and P. J. Flynn. 1999. Data clustering: A review. ACM Computing Surveys 31 (3):264–323. doi:https://doi.org/10.1145/331499.331504.
- James, W., and C. Stein. 1961. Estimation with quadratic loss. Proceedings of the Fourth Berkeley Symposium on Mathematical Statistics and Probability 1:361–79.
- Kaufman, L., and P. Rousseeuw. 1987. Clustering by means of medoids. In Statistical Data Analysis Based on the L1-Norm and Related Methods, ed. Y. Dodge, 405–16. Amsterdam: North-Holland.
- Kohonen, T. 1986. Learning vector quantization for pattern recognition. Technical Report TKK-F-A601, Helsinki University of Technology.
- Kohonen, T. 1995. Learning vector quantization. In Self-Organizing Maps, Springer Series in Information Sciences 30, 175–89. Berlin: Springer.
- Krishna, K., and M. Murty. 1999. Genetic K-means algorithm. IEEE Transactions on Systems, Man, and Cybernetics, Part B (Cybernetics) 29 (3):433–9. doi:https://doi.org/10.1109/3477.764879.
- Lehmann, E. L., and G. Casella. 2006. Theory of point estimation. New York, NY: Springer Science & Business Media.
- MacQueen, J. 1967. Some methods for classification and analysis of multivariate observations. Proceedings of the Fifth Berkeley Symposium on Mathematical Statistics and Probability 1:281–97.
- Martín-Valdivia, M. T., L. A. Ureña-López, and M. García-Vega. 2007. The learning vector quantization algorithm applied to automatic text classification tasks. Neural Networks 20 (6):748–56. doi:https://doi.org/10.1016/j.neunet.2006.12.005.
- Oliveira, G. V., F. P. Coutinho, R. J. Campello, and M. C. Naldi. 2017. Improving k-means through distributed scalable meta heuristics. Neurocomputing 246:45–57. doi:https://doi.org/10.1016/j.neucom.2016.07.074.
- Podržaj, P., and A. Čebular. 2016. The application of LVQ neural network for weld strength evaluation of RF-Welded plastic materials. IEEE/ASME Transactions on Mechatronics 21 (2):1063–71.
- Rand, W. M. 1971. Objective criteria for the evaluation of clustering methods. Journal of the American Statistical Association 66 (336):846–50. doi:https://doi.org/10.1080/01621459.1971.10482356.
- Richards, J. A. 1999. An introduction to James-Stein estimation. http://ssg.mit.edu/group/alumni/johnrich/docs/jse.ps.gz.
- Saxena, A., M. Prasad, A. Gupta, N. Bharill, O. P. Patel, A. Tiwari, M. J. Er, W. Ding, and C.-T. Lin. 2017. A review of clustering techniques and developments. Neurocomputing 267:664–81. doi:https://doi.org/10.1016/j.neucom.2017.06.053.
- Yoder, J., and C. E. Priebe. 2017. Semi-supervised k-means++. Journal of Statistical Computation and Simulation 87 (13):2597–608. doi:https://doi.org/10.1080/00949655.2017.1327588.