Abstract
Using a probability inequality, we show that if $U = (\dot{U}, \ddot{U})$, with $\dot{U} = (U_1, \ldots, U_k)$ and $\ddot{U} = (U_{k+1}, \ldots, U_{2k})$, has a multivariate normal distribution with mean vector zero and covariance matrices $\operatorname{cov}(\dot{U}, \dot{U}) = \operatorname{cov}(\ddot{U}, \ddot{U}) = R$ and $\operatorname{cov}(\dot{U}, \ddot{U}) = e'e - R$, where $R$ is a $k \times k$ correlation matrix and $e = (1, \ldots, 1)$, then $P\{|U_1| < c, \ldots, |U_{2k}| < c\}$ is minimized by setting all correlations in $R$ equal to $\tfrac{1}{2}$. This result is used to adjust the apparent significance of a difference between a treatment and a control for a subgroup in a clinical trial when the difference for this subgroup has been selected as the most significant of the differences for the $2k$ subgroups defined by $k$ dichotomies.
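The extremal result above yields a conservative adjusted significance level, since the least-favorable structure (all correlations in $R$ equal to $\tfrac{1}{2}$) maximizes $1 - P\{|U_1| < c, \ldots, |U_{2k}| < c\}$ over the family considered. The following is a minimal Monte Carlo sketch of that computation, not the paper's own procedure: it builds the $2k \times 2k$ covariance matrix implied by the abstract's definitions and estimates the probability by simulation. The function name adjusted_p_value and its parameters are illustrative.

    import numpy as np

    def adjusted_p_value(c, k, n_sim=200_000, seed=0):
        """Monte Carlo estimate of P{max_i |U_i| >= c} under the
        least-favorable covariance from the abstract:
        cov(Udot, Udot) = cov(Uddot, Uddot) = R with all correlations 1/2,
        and cov(Udot, Uddot) = ee' - R."""
        rng = np.random.default_rng(seed)
        J = np.ones((k, k))
        R = 0.5 * (np.eye(k) + J)        # 1 on the diagonal, 1/2 elsewhere
        off = J - R                      # ee' - R: 0 on the diagonal, 1/2 elsewhere
        sigma = np.block([[R, off], [off, R]])
        # sigma is singular, so sample via its eigendecomposition
        # rather than a Cholesky factorization
        w, v = np.linalg.eigh(sigma)
        root = v * np.sqrt(np.clip(w, 0.0, None))   # root @ root.T == sigma
        u = rng.standard_normal((n_sim, 2 * k)) @ root.T
        # P{max_i |U_i| >= c} = 1 - P{|U_1| < c, ..., |U_2k| < c}
        return float(np.mean(np.abs(u).max(axis=1) >= c))

For example, with $k = 3$ dichotomies (six subgroups) and an observed maximum absolute statistic of $2.5$, adjusted_p_value(2.5, 3) estimates the conservative significance level after selection of the most significant subgroup.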