Abstract
A screening experiment attempts to identify a subset of important effects using a relatively small number of experimental runs. Given the limited run size and a large number of possible effects, penalized regression is a popular tool for analyzing screening designs. In particular, an automated implementation of the Gauss-Dantzig selector has been widely recommended to compare screening design construction methods. Here, we illustrate potential reproducibility issues that arise when comparing two-level screening designs via simulation, and recommend a graphical method, based on screening probabilities, that compares designs by evaluating them along the penalized regression solution path. This method can be implemented using simulation, or, in the case of lasso, by using exact local lasso sign recovery probabilities. Our approach circumvents the need to specify tuning parameters associated with regularization methods, leading to more reliable design comparisons. This article contains Supplementary Materials including code to implement the proposed methods.
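The simulation-based version of the approach described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the design matrix, active-effect vector, noise level, and penalty grid are all hypothetical choices, and `sklearn.linear_model.lasso_path` stands in for whatever lasso solver the Supplementary Materials use. The idea is to estimate, at each point on a shared solution path, the probability that the lasso recovers the signs of the true effects, so that designs can be compared graphically across the whole path rather than at a single tuned penalty value.

```python
import numpy as np
from sklearn.linear_model import lasso_path

rng = np.random.default_rng(0)

# Hypothetical 12-run, 6-factor two-level design (random +/-1 entries for
# illustration; a real comparison would plug in candidate screening designs).
n, p = 12, 6
X = rng.choice([-1.0, 1.0], size=(n, p))

# Assumed true model: 3 active effects, 3 inert effects.
beta = np.array([3.0, -3.0, 3.0, 0.0, 0.0, 0.0])
true_signs = np.sign(beta)

n_sim = 200
lambdas = np.logspace(0.5, -2, 50)  # shared, decreasing penalty grid

hits = np.zeros(len(lambdas))
for _ in range(n_sim):
    y = X @ beta + rng.normal(scale=1.0, size=n)
    # coefs has shape (p, len(lambdas)): one lasso solution per penalty value
    _, coefs, _ = lasso_path(X, y, alphas=lambdas)
    # sign recovery: estimated signs match the true signs at a given penalty
    hits += np.all(np.sign(coefs) == true_signs[:, None], axis=0)

# Estimated sign-recovery probability at each point on the solution path;
# plotting this curve for each candidate design gives the graphical comparison.
recovery_prob = hits / n_sim
```

Repeating the loop for each candidate design and overlaying the `recovery_prob` curves avoids committing to any single tuning-parameter value, which is the reproducibility pitfall the abstract describes.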
Acknowledgements
We would like to thank the editor and two anonymous referees for their valuable feedback that improved this article.
Data availability statement
The authors confirm that the data supporting the findings of this study are available within the article and its Supplementary Materials.
Disclosure statement
No potential conflict of interest was reported by the author(s).
Additional information
Notes on contributors
Kade Young
Dr. Kade Young is a recent graduate of North Carolina State University. His research interests include the design and analysis of experiments and penalized estimation.
Maria L. Weese
Dr. Maria Weese is an Associate Professor of Business Analytics at Miami University in Oxford, Ohio. Her research interests include design and analysis of experiments, process monitoring, and applications of analytics in practice.
Jonathan W. Stallrich
Dr. Jonathan Stallrich is an Associate Professor in the Department of Statistics at North Carolina State University. He earned his Ph.D. in Statistics from Virginia Tech in 2014. His research interests include design and analysis of screening experiments, computer experiments, functional data analysis, and variable selection. In 2021, he and his coauthors were awarded the American Statistical Association’s Statistics in Physical Engineering Sciences Award for the paper, “Optimal EMG placement for a robotic prosthesis controller with sequential, adaptive functional estimation.” He is currently chair of the American Statistical Association’s Section on Physical and Engineering Sciences.
Byran J. Smucker
Dr. Byran J. Smucker is an Associate Professor of Statistics at Miami University in Oxford, Ohio. His research interests include the design and analysis of experiments, as well as applications of experiments, optimization and predictive modeling.
David J. Edwards
Dr. David J. Edwards is Professor of Statistics and Chair of Statistical Sciences and Operations Research at Virginia Commonwealth University. He is Editor-in-Chief of Quality Engineering and a former associate editor of Technometrics. His research interests are design and analysis of experiments, response surface methodology, and model selection.