References
- Agresti, A. 2003. Categorical data analysis, vol. 482. New York: John Wiley & Sons.
- Alwan, L. C., and H. V. Roberts. 1988. Time-series modeling for statistical process control. Journal of Business & Economic Statistics 6:87–95.
- Box, G. E., and N. R. Draper. 2007. Response surfaces, mixtures, and ridge analyses, vol. 649. New York, NY: John Wiley & Sons.
- Box, G. E. P., and K. Wilson. 1951. On the experimental attainment of optimum conditions. Journal of the Royal Statistical Society: Series B (Methodological) 13 (1):1–38. doi: https://doi.org/10.1111/j.2517-6161.1951.tb00067.x.
- Breiman, L. 2001. Statistical modeling: the two cultures (with comments and a rejoinder by the author). Statistical Science 16 (3):199–231. doi: https://doi.org/10.1214/ss/1009213726.
- Bui, A., and D. W. Apley. 2018. Monitoring for changes in the nature of stochastic textured surfaces. Journal of Quality Technology 50 (4):363–78. doi: https://doi.org/10.1080/00224065.2018.1507559.
- Castillo, E., and B. M. Colosimo. 2011. Statistical shape analysis of experiments for manufacturing processes. Technometrics 53 (1):1–15. doi: https://doi.org/10.1198/TECH.2010.08194.
- Cohn, D., L. Atlas, and R. Ladner. 1994. Improving generalization with active learning. Machine Learning 15 (2):201–21. doi: https://doi.org/10.1007/BF00993277.
- Del Castillo, E. 2007. Process optimization: A statistical approach, vol. 105. New York, NY: Springer Science & Business Media.
- Del Castillo, E., and X. Zhao. 2020a. Industrial statistics and manifold data. Quality Engineering 32 (2):155–67. doi: https://doi.org/10.1080/08982112.2019.1641608.
- Del Castillo, E., and X. Zhao. 2020b. Statistical process monitoring for manifold data. In Wiley StatsRef: Statistics Reference Online. doi: https://doi.org/10.1002/9781118445112.
- Du, S. S., Y. Wang, X. Zhai, S. Balakrishnan, R. Salakhutdinov, and A. Singh. 2018. How many samples are needed to estimate a convolutional neural network? In Advances in neural information processing systems, 371–381.
- Eckles, D., B. Karrer, and J. Ugander. 2017. Design and analysis of experiments in networks: Reducing bias from interference. Journal of Causal Inference 5 (1). doi: https://doi.org/10.1515/jci-2015-0021.
- Efron, B., and T. Hastie. 2016. Computer age statistical inference. New York, NY: Cambridge University Press.
- Fang, X., K. Paynabar, and N. Gebraeel. 2019. Image-based prognostics using penalized tensor regression. Technometrics 61 (3):369–84. doi: https://doi.org/10.1080/00401706.2018.1527727.
- Fang, X., H. Yan, N. Gebraeel, and K. Paynabar. 2021. Multi-sensor prognostics modeling for applications with highly incomplete signals. IISE Transactions 53 (5):597–613. doi: https://doi.org/10.1080/24725854.2020.1789779.
- Frazier, P. I. 2018. Bayesian optimization. In Recent advances in optimization and modeling of contemporary problems, 255–78. INFORMS. doi: https://doi.org/10.1287/educ.2018.0188.
- Friedman, M., and L. Savage. 1947. Planning experiments seeking maxima. In Selected techniques of statistical analysis, 364–372. New York, NY: McGraw Hill.
- Gartner. 2021. Hype Cycle for Artificial Intelligence, 2021. Gartner Identifies Four Trends Driving Near-Term Artificial Intelligence Innovation. https://www.gartner.com/en/newsroom/press-releases/2021-09-07-gartner-identifies-four-trends-driving-near-term-artificial-intelligence-innovation
- Gebraeel, N. Z., M. A. Lawley, R. Li, and J. K. Ryan. 2005. Residual-life distributions from component degradation signals: A Bayesian approach. IIE Transactions 37 (6):543–57. doi: https://doi.org/10.1080/07408170590929018.
- Goodfellow, I., Y. Bengio, and A. Courville. 2016. Deep learning. Cambridge, MA: MIT Press.
- Gramacy, R. B. 2020. Surrogates: Gaussian process modeling, design, and optimization for the applied sciences. Boca Raton, FL: Chapman and Hall/CRC.
- Hastie, T., R. Tibshirani, and J. Friedman. 2013. The elements of statistical learning: Data mining, inference, and prediction, 2nd ed., Springer Series in Statistics. New York, NY: Springer.
- Hong, Y., M. Zhang, and W. Q. Meeker. 2018. Big data and reliability applications: The complexity dimension. Journal of Quality Technology 50 (2):135–49. doi: https://doi.org/10.1080/00224065.2018.1438007.
- Hotelling, H. 1941. Experimental determination of the maximum of a function. The Annals of Mathematical Statistics 12 (1):20–45. doi: https://doi.org/10.1214/aoms/1177731784.
- Hussain, F., R. Hussain, and E. Hossain. 2021. Explainable artificial intelligence (XAI): An engineering perspective. arXiv preprint arXiv:2101.03613.
- Jones, D. R., M. Schonlau, and W. J. Welch. 1998. Efficient global optimization of expensive black-box functions. Journal of Global Optimization 13 (4):455–92. doi: https://doi.org/10.1023/A:1008306431147.
- Kiefer, J., and J. Wolfowitz. 1952. Stochastic estimation of the maximum of a regression function. The Annals of Mathematical Statistics 23 (3):462–6. doi: https://doi.org/10.1214/aoms/1177729392.
- Kim, M., J.-R. C. Cheng, and K. Liu. 2021. An adaptive sensor selection framework for multisensor prognostics. Journal of Quality Technology. doi: https://doi.org/10.1080/00224065.2021.1960934.
- Kohavi, R., R. M. Henne, and D. Sommerfield. 2007. Practical guide to controlled experiments on the web: Listen to your customers not to the HiPPO. In Proceedings of the 13th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, 959–67.
- Li, H., E. Del Castillo, and G. Runger. 2020. On active learning methods for manifold data. Test 29 (1):1–33. doi: https://doi.org/10.1007/s11749-019-00694-y.
- Li, W., F. Tsung, Z. Song, K. Zhang, and D. Xiang. 2021. Multi-sensor based landslide monitoring via transfer learning. Journal of Quality Technology. doi: https://doi.org/10.1080/00224065.2021.1960936.
- Lian, J., L. Freeman, Y. Hong, and X. Deng. 2021. Robustness with respect to class imbalance in artificial intelligence classification algorithms. Journal of Quality Technology. doi: https://doi.org/10.1080/00224065.2021.1963200.
- Lin, C.-Y. 2021. Forward stepwise random forest analysis for experimental designs. Journal of Quality Technology. doi: https://doi.org/10.1080/00224065.2020.1865853.
- Liu, X., and R. Pan. 2021. Boost-R: Gradient boosted trees for recurrence data. Journal of Quality Technology. doi: https://doi.org/10.1080/00224065.2021.1948373.
- Marcinkevičs, R., and J. E. Vogt. 2020. Interpretability and explainability: A machine learning zoo mini-tour. arXiv preprint arXiv:2012.01805.
- Meeker, W. Q., and L. A. Escobar. 1998. Statistical methods for reliability data. New York: Wiley.
- Murdoch, W. J., C. Singh, K. Kumbier, R. Abbasi-Asl, and B. Yu. 2019. Definitions, methods, and applications in interpretable machine learning. Proceedings of the National Academy of Sciences 116 (44):22071–80. doi: https://doi.org/10.1073/pnas.1900654116.
- Murphy, K. P. 2012. Machine learning: A probabilistic perspective. Cambridge, MA: MIT Press.
- Myers, R. H., D. C. Montgomery, and C. M. Anderson-Cook. 2016. Response surface methodology: Process and product optimization using designed experiments. New York, NY: John Wiley & Sons.
- Newman, M. 2010. Networks. Oxford, UK: Oxford University Press.
- Pearl, J. 2009. Causality: Models, reasoning, and inference, 2nd ed. New York, NY: Cambridge University Press.
- Peterson, J. J. 2004. A posterior predictive approach to multiple response surface optimization. Journal of Quality Technology 36 (2):139–53. doi: https://doi.org/10.1080/00224065.2004.11980261.
- Peterson, J., and E. del Castillo. 2021. Process optimization with multiple response variables: A predictive distribution approach with R and Stan. Forthcoming. Boca Raton, FL: CRC Press.
- Pronzato, L. 2000. Adaptive optimization and D-optimum experimental design. Annals of Statistics 28 (6):1743–61.
- Psarakis, S. 2011. The use of neural networks in statistical process control charts. Quality and Reliability Engineering International 27 (5):641–50. doi: https://doi.org/10.1002/qre.1227.
- Rasmussen, C. E., and C. K. I. Williams. 2006. Gaussian processes for machine learning. Cambridge, MA: The MIT Press.
- Reisi Gahrooei, M., H. Yan, K. Paynabar, and J. Shi. 2021. Multiple tensor-on-tensor regression: An approach for modeling processes with heterogeneous sources of data. Technometrics 63 (2):147–59. doi: https://doi.org/10.1080/00401706.2019.1708463.
- Robbins, H., and S. Monro. 1951. A stochastic approximation method. The Annals of Mathematical Statistics 22 (3):400–7. doi: https://doi.org/10.1214/aoms/1177729586.
- Rubin, D. B. 1978. Bayesian inference for causal effects: The role of randomization. Annals of Statistics 6 (1):34–58.
- Russell, S., and P. Norvig. 2021. Artificial intelligence: A modern approach. 4th ed. Pearson.
- Sergin, N. D., and H. Yan. 2021. Toward a better monitoring statistic for profile monitoring via variational autoencoders. Journal of Quality Technology. doi: https://doi.org/10.1080/00224065.2021.1903821.
- Settles, B. 2009. Active learning literature survey. https://minds.wisconsin.edu/handle/1793/60660
- Shan, X., and D. W. Apley. 2008. Blind identification of manufacturing variation patterns by combining source separation criteria. Technometrics 50 (3):332–43. doi: https://doi.org/10.1198/004017008000000316.
- Shi, J. 2014. Stream of variations analysis. In Encyclopedia of systems and control, ed. J. Baillieul, T. Samad. London: Springer.
- Shortle, J. F., and M. B. Mendel. 2001. Physical foundations for lifetime distributions. In System and Bayesian reliability: Essays in honor of Professor Richard E. Barlow on his 70th birthday, 257–66.
- Thompson, W., L. Hui, and A. Bolen. 2021. Artificial intelligence, machine learning, deep learning and more. SAS. https://www.sas.com/en_us/insights/articles/big-data/artificial-intelligence-machine-learning-deep-learning-and-beyond.html.
- Thomson, A. C. 1988. Real-time artificial intelligence for process monitoring and control. IFAC Proceedings Volumes 21 (13):67–72. doi: https://doi.org/10.1016/S1474-6670(17)53701-5.
- Turing, A. M. 1950. Computing machinery and intelligence. Reprinted in Mind design II, ed. J. Haugeland, 29–56. Cambridge, MA: MIT Press.
- Vapnik, V. N. 1998. Statistical learning theory. New York: Wiley.
- Wang, A., and J. Shi. 2021. Holistic modeling and analysis of multistage manufacturing processes with sparse effective inputs and mixed profile outputs. IISE Transactions 53 (5):582–596. doi: https://doi.org/10.1080/24725854.2020.1786197.
- Wasserman, S., and K. Faust. 1994. Social network analysis: Methods and applications. Cambridge University Press.
- Weese, M., W. Martinez, F. M. Megahed, and L. A. Jones-Farmer. 2016. Statistical learning methods applied to process monitoring: An overview and perspective. Journal of Quality Technology 48 (1):4–24. doi: https://doi.org/10.1080/00224065.2016.11918148.
- Woodall, W. H., and D. C. Montgomery. 2014. Some current directions in the theory and application of statistical process monitoring. Journal of Quality Technology 46 (1):78–94. doi: https://doi.org/10.1080/00224065.2014.11917955.
- Yan, H., K. Paynabar, and M. Pacella. 2019. Point cloud data analysis for process modeling and optimization. Technometrics 61 (3):385–395. doi: https://doi.org/10.1080/00401706.2018.1529628.
- Yan, H., N. D. Sergin, W. A. Brenneman, S. J. Lange, and S. Ba. 2021. Deep multistage multi-task learning for quality prediction of multistage manufacturing systems. Journal of Quality Technology. doi: https://doi.org/10.1080/00224065.2021.1903822.
- Zhang, L. W., J. Lin, B. Liu, Z. C. Zhang, X. H. Yan, and M. H. Wei. 2019. A review on deep learning applications in prognostics and health management. IEEE Access 7:162415–38. doi: https://doi.org/10.1109/ACCESS.2019.2950985.
- Zorriassatine, F., and J. D. T. Tannock. 1998. A review of neural networks for statistical process control. Journal of Intelligent Manufacturing 9 (3):209–224. doi: https://doi.org/10.1023/A:1008818817588.