Original Articles

On implicit Lagrangian twin support vector regression by Newton method

Pages 50-64 | Received 05 Feb 2012, Accepted 09 Jun 2013, Published online: 27 Nov 2013

References

  • Osuna, E., Freund, R., & Girosi, F. (1997). Training support vector machines: An application to face detection, in Proceedings of Computer Vision and Pattern Recognition, 130–136.
  • Brown, M.P.S., Grundy, W.N., & Lin, D. (2000). Knowledge-based analysis of microarray gene expression data using support vector machines, Proceedings of the National Academy of Sciences of the USA, 97(1), 262–267.
  • Chen, S., & Wang, M. (2005). Seeking multi-threshold directly from support vectors for image segmentation, Neurocomputing, 67, 335–344.
  • Mukherjee, S., Osuna, E., & Girosi, F. (1997). Nonlinear prediction of chaotic time series using support vector machines, in NNSP'97: Neural Networks for Signal Processing VII, Proceedings of the IEEE Signal Processing Society Workshop, Amelia Island, FL, USA, 511–520.
  • Muller, K.R., Smola, A.J., Ratsch, G., Schölkopf, B., & Kohlmorgen, J. (1999). Using support vector machines for time series prediction, in: B. Schölkopf, C.J.C. Burges, & A.J. Smola (Eds.), Advances in Kernel Methods: Support Vector Learning, MIT Press, Cambridge, MA, 243–254.
  • Cristianini, N., & Shawe-Taylor, J. (2000). An introduction to support vector machines and kernel-based learning methods, Cambridge University Press, Cambridge.
  • Vapnik, V.N. (2000). The nature of statistical learning theory, 2nd Edition, Springer, New York.
  • Mangasarian, O.L., & Wild, E.W. (2006). Multisurface proximal support vector classification via generalized eigenvalues, IEEE Transactions on Pattern Analysis and Machine Intelligence, 28(1), 69–74.
  • Jayadeva, Khemchandani, R., & Chandra, S. (2007). Twin support vector machines for pattern classification, IEEE Transactions on Pattern Analysis and Machine Intelligence, 29(5), 905–910.
  • Kumar, M.A., & Gopal, M. (2008). Application of smoothing technique on twin support vector machines, Pattern Recognition Letters, 29, 1842–1848.
  • Kumar, M.A., & Gopal, M. (2009). Least squares twin support vector machines for pattern classification, Expert Systems with Applications, 36, 7535–7543.
  • Suykens, J.A.K., Vandewalle, J., & Moor, B.D. (2001). Optimal control by least squares support vector machines, Neural Networks, 14(1), 23–25.
  • Balasundaram, S., & Kapil (2010). On Lagrangian support vector regression, Expert Systems with Applications, 37, 8784–8792.
  • Balasundaram, S., & Singh, R. (2010). On finite Newton method for support vector regression, Neural Computing & Applications, 19, 967–977.
  • Lee, Y.J., Hsieh, W.-F., & Huang, C.-M. (2005). ε-SSVR: A smooth support vector machine for ε-insensitive regression, IEEE Transactions on Knowledge and Data Engineering, 17(5), 678–685.
  • Musicant, D.R., & Feinberg, A. (2004). Active set support vector regression, IEEE Transactions on Neural Networks, 15(2), 268–275.
  • Peng, X. (2010). TSVR: An efficient Twin Support Vector Machine for regression, Neural Networks, 23(3), 365–372.
  • Chen, X., Yang, J., Liang, J., & Ye, Q. (2012). Smooth twin support vector regression, Neural Computing & Applications, 21, 505–513.
  • Zhong, P., Xu, Y., & Zhao, Y. (2012). Training twin support vector regression via linear programming, Neural Computing & Applications, 21, 399–407.
  • Fung, G., & Mangasarian, O.L. (2003). Finite Newton method for Lagrangian support vector machine, Neurocomputing, 55, 39–55.
  • Balasundaram, S., & Kapil (2011). Finite Newton method for implicit Lagrangian support vector regression, International Journal of Knowledge-based and Intelligent Engineering Systems, 15, 203–214.
  • Mangasarian, O.L., & Musicant, D.R. (2001). Lagrangian support vector machines, Journal of Machine Learning Research, 1, 161–177.
  • Mangasarian, O.L. (1994). Nonlinear programming, SIAM, Philadelphia, PA.
  • Mangasarian, O.L., & Solodov, M.V. (1993). Nonlinear complementarity as unconstrained and constrained minimization, Mathematical Programming, Series B, 62, 277–297.
  • Hiriart-Urruty, J.-B., Strodiot, J.J., & Nguyen, H. (1984). Generalized Hessian matrix and second order optimality conditions for problems with C^{1,1} data, Applied Mathematics and Optimization, 11, 43–56.
  • Li, G., Wen, C., Huang, G.-B., & Chen, Y. (2011). Error tolerance based support vector machine for regression, Neurocomputing, 74(5), 771–782.
  • Box, G.E.P., & Jenkins, G.M. (1976). Time series analysis: Forecasting and Control, Holden-Day, San Francisco.
  • Wang, X.X., Chen, S., Lowe, D., & Harris, C.J. (2006). Sparse support vector regression based on orthogonal forward selection for generalized kernel model, Neurocomputing, 70, 462–474.
  • Casdagli, M. (1989). Nonlinear prediction of chaotic time series, Physica D, 35, 335–356.
  • Murphy, P.M., & Aha, D.W. (1992). UCI repository of machine learning databases. University of California, Irvine. http://www.ics.uci.edu/∼mlearn
  • DELVE (2005). Data for Evaluating Learning in Valid Experiments. http://www.cs.toronto.edu/∼delve/data
