Research Article

An Adaptive Neural Network Regression Method for Structure Identification

Received 02 Oct 2022, Accepted 06 Nov 2023, Published online: 03 Jan 2024
