Abstract
A Bayesian framework for sequential classification on finite lattice models is described in which response distributions are allowed to vary with the experiment. Optimal rates of convergence in classification are established. Intuitive and computationally simple experiment selection rules are proposed, and this class of rules is shown to attain the optimal rates almost surely under general conditions. A simulation study demonstrates that sequential classification can be conducted efficiently on lattices, with potentially substantial savings in experiment administration while maintaining high classification accuracy. The framework can be applied to adaptive testing for cognitive assessment and to other sequential classification problems, such as group testing when experimental response distributions depend on pool composition.
Notes
Recommended by K. Rekab