Original Articles

Boosted Varying-Coefficient Regression Models for Product Demand Prediction

Pages 361-382 | Received 01 Jun 2012, Accepted 01 Feb 2013, Published online: 28 Apr 2014

REFERENCES

  • Aitchison, J., and Aitken, C. G. G. (1976), “Multivariate Binary Discrimination by the Kernel Method,” Biometrika, 63, 413–420.
  • Bartlett, P., and Traskin, M. (2007), “AdaBoost is Consistent,” in Advances in Neural Information Processing Systems 19, eds. B. Schölkopf, J. Platt, and T. Hoffman, Cambridge, MA: MIT Press, pp. 105–112.
  • Breiman, L. (1996), “Bagging Predictors,” Machine Learning, 24, 123–140.
  • ——— (1998), “Arcing Classifiers” (with discussion), The Annals of Statistics, 26, 801–849.
  • ——— (1999), “Prediction Games and Arcing Algorithms,” Neural Computation, 11, 1493–1517.
  • ——— (2001), “Random Forests,” Machine Learning, 45, 5–32.
  • Breiman, L., Friedman, J., Olshen, R., and Stone, C. (1984), Classification and Regression Trees, New York: Wadsworth.
  • Bühlmann, P. (2006), “Boosting for High-Dimensional Linear Models,” The Annals of Statistics, 34, 559–583.
  • Bühlmann, P., and Hothorn, T. (2007), “Boosting Algorithms: Regularization, Prediction and Model Fitting” (with discussion), Statistical Science, 22, 477–505.
  • Bühlmann, P., and Yu, B. (2003), “Boosting With the L2 Loss: Regression and Classification,” Journal of the American Statistical Association, 98, 324–339.
  • Chipman, H., George, E., and McCulloch, R. (1998), “Bayesian CART Model Search,” Journal of the American Statistical Association, 93, 935–960.
  • Chipman, H., George, E.I., and McCulloch, R.E. (2002), “Bayesian Treed Models,” Machine Learning, 48, 299–320.
  • Cleveland, W., Grosse, E., and Shyu, W. (1992), “Local Regression Models,” in Statistical Models in S, eds. J. M. Chambers and T. J. Hastie, Pacific Grove, CA: Wadsworth & Brooks, pp. 309–376.
  • Denison, D. G. T., Mallick, B. K., and Smith, A. F. M. (1998), “A Bayesian CART Algorithm,” Biometrika, 85, 363–377.
  • Fan, G., and Gray, J.B. (2005), “Regression Tree Analysis Using TARGET,” Journal of Computational and Graphical Statistics, 14, 1–13.
  • Fan, J., and Gijbels, I. (2003), Local Polynomial Modelling and Its Applications, London: Chapman and Hall.
  • Fan, J., and Zhang, W. (2008), “Statistical Methods With Varying Coefficient Models,” Statistics and Its Interface, 1, 179–195.
  • Fisher, W. D. (1958), “On Grouping for Maximum Homogeneity,” Journal of the American Statistical Association, 53, 789–798.
  • Freund, Y., and Schapire, R. (1996), “Experiments With a New Boosting Algorithm,” in Proceedings of the Thirteenth International Conference on Machine Learning, San Francisco, CA: Morgan Kaufmann Publishers Inc., pp. 148–156.
  • ——— (1997), “A Decision-Theoretic Generalization of On-Line Learning and an Application to Boosting,” Journal of Computer and System Sciences, 55, 119–139.
  • Friedman, J.H. (2001), “Greedy Function Approximation: A Gradient Boosting Machine,” The Annals of Statistics, 29, 1189–1232.
  • Goldberg, D. (1989), Genetic Algorithms in Search, Optimization and Machine Learning, Reading, MA: Addison-Wesley.
  • Hastie, T., Tibshirani, R., and Friedman, J. (2009), The Elements of Statistical Learning: Data Mining, Inference, and Prediction, New York: Springer-Verlag.
  • Hastie, T., and Tibshirani, R.J. (1993), “Varying-Coefficient Models,” Journal of the Royal Statistical Society, Series B, 55, 757–796.
  • Jiang, W. (2004), “Process Consistency for AdaBoost,” The Annals of Statistics, 32, 13–29.
  • Lugosi, G., and Vayatis, N. (2004), “On the Bayes-Risk Consistency of Regularized Boosting Methods,” The Annals of Statistics, 32, 30–55.
  • Loh, W.-Y. (2002), “Regression Trees With Unbiased Variable Selection and Interaction Detection,” Statistica Sinica, 12, 361–386.
  • Loh, W.-Y., and Kim, H. (2001), “Classification Trees With Unbiased Multiway Splits,” Journal of the American Statistical Association, 96, 589–604.
  • Loh, W.-Y., and Shih, Y.-S. (1997), “Split Selection Methods for Classification Trees,” Statistica Sinica, 7, 815–840.
  • McCullagh, P., and Nelder, J.A. (1989), Generalized Linear Models, Boca Raton, FL: Chapman & Hall.
  • McFadden, D. (1974), “Conditional Logit Analysis of Qualitative Choice Behavior,” in Frontiers in Econometrics, ed. P. Zarembka, New York: Academic Press, pp. 105–142.
  • Morgan, J.N., and Sonquist, J.A. (1963), “Problems in the Analysis of Survey Data, and a Proposal,” Journal of the American Statistical Association, 58, 415–434.
  • Papagelis, A., and Kalles, D. (2001), “Breeding Decision Trees Using Evolutionary Techniques,” in Proceedings of the Eighteenth International Conference on Machine Learning (ICML ’01), Williamstown, MA, pp. 393–400.
  • Quinlan, J.R. (1993), C4.5: Programs for Machine Learning, San Mateo, CA: Morgan Kaufmann.
  • Tibshirani, R. (1996), “Regression Shrinkage and Selection Via the Lasso,” Journal of the Royal Statistical Society, Series B, 58, 267–288.
  • Zhang, T., and Yu, B. (2005), “Boosting With Early Stopping: Convergence and Consistency,” The Annals of Statistics, 33, 1538–1579.
  • Zou, H., and Hastie, T. (2005), “Regularization and Variable Selection Via the Elastic Net,” Journal of the Royal Statistical Society, Series B, 67, 301–320.
