Abstract
AdaBoost.M1 has been successfully applied to improve the accuracy of learning algorithms for multi-class classification problems. However, its required conditions may be hard to satisfy in some practical cases. An improved algorithm called AdaBoost.MK is developed to address this problem. Earlier support vector machine (SVM)-based multi-class classification algorithms work by splitting the original problem into a set of two-class subproblems, and the time and space these algorithms require can be prohibitive. We develop a multi-class classification algorithm by incorporating one-class SVMs with a well-designed discriminant function. Finally, a hybrid method integrating AdaBoost.MK and one-class SVMs is proposed to solve multi-class classification problems. Experimental results on data sets from UCI and Statlog show that the proposed approach outperforms other multi-class algorithms, such as support vector data descriptions (SVDDs) and AdaBoost.M1 with one-class SVMs, and the improvement is found to be statistically significant.
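To make the one-class-SVM-based construction concrete, the following is a minimal sketch, not the paper's actual method: it trains one `OneClassSVM` (scikit-learn) per class and classifies a sample by the largest decision score. The paper's well-designed discriminant function is not reproduced here; the signed distance to each class boundary (`decision_function`) is used as a hypothetical stand-in.

```python
import numpy as np
from sklearn.svm import OneClassSVM

class OneClassSVMMulticlass:
    """Hypothetical sketch: one one-class SVM per class, with the
    decision score standing in for the paper's discriminant function."""

    def __init__(self, nu=0.1, gamma="scale"):
        self.nu = nu
        self.gamma = gamma
        self.models = {}

    def fit(self, X, y):
        # Train each one-class SVM only on the samples of its own class.
        for label in np.unique(y):
            model = OneClassSVM(nu=self.nu, gamma=self.gamma)
            model.fit(X[y == label])
            self.models[label] = model
        return self

    def predict(self, X):
        labels = list(self.models)
        # decision_function: larger value = deeper inside that class's boundary.
        scores = np.column_stack(
            [self.models[label].decision_function(X) for label in labels]
        )
        return np.asarray(labels)[scores.argmax(axis=1)]
```

This decomposition trains K one-class models on K single-class subsets, rather than the O(K^2) pairwise or K rest-inclusive subproblems of the earlier two-class splitting schemes mentioned above.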
Acknowledgments
This work was supported by the National Science Council under grant NSC 95-2221-E-110-055-MY2.
The authors are grateful to the anonymous reviewer for the comments, which were very helpful in improving the quality and presentation of this article.