Abstract
We prove an analogue of the classical zero-one law for both homogeneous and nonhomogeneous Markov chains (MCs). Its almost precise formulation is simple: given any event $A$ from the tail σ-algebra of the MC, for large $n$, with probability near one, the trajectories of the MC are in states $i$ where $P(A \mid X_n = i)$ is either near 0 or near 1. A similar statement holds for the entrance σ-algebra, when $n$ tends to $-\infty$. To formulate this second result, we give detailed results on the existence of nonhomogeneous Markov chains indexed by $\mathbb{Z}$ or $\mathbb{Z}_{-}$, in both the finite and countable cases. This extends a well-known result due to Kolmogorov. Further, in our discussion, we note an interesting dichotomy between two commonly used definitions of MCs.
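As a sketch only, the tail-σ-algebra statement described above can be formalized along the following lines; the paper's exact hypotheses, notation, and quantifiers may differ.

```latex
% A plausible formalization of the zero-one law sketched in the abstract;
% hypotheses and notation are assumptions, not the paper's exact statement.
Let $(X_n)_{n \ge 0}$ be a (possibly nonhomogeneous) Markov chain and let
$A$ belong to its tail $\sigma$-algebra
$\mathcal{T} = \bigcap_{m \ge 0} \sigma(X_m, X_{m+1}, \dots)$.
Then for every $\varepsilon > 0$,
\[
  \lim_{n \to \infty} P\bigl( X_n \in S_n(\varepsilon) \bigr) = 1,
  \quad \text{where} \quad
  S_n(\varepsilon) = \{\, i : P(A \mid X_n = i) < \varepsilon
    \ \text{or}\ P(A \mid X_n = i) > 1 - \varepsilon \,\}.
\]
```

That is, the chain spends almost all of its late-time probability mass on states from which the tail event $A$ is nearly decided.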
Disclosure statement
No potential conflict of interest was reported by the author(s).